Dec  3 08:17:26 np0005544118 kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec  3 08:17:26 np0005544118 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec  3 08:17:26 np0005544118 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  3 08:17:26 np0005544118 kernel: BIOS-provided physical RAM map:
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec  3 08:17:26 np0005544118 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec  3 08:17:26 np0005544118 kernel: NX (Execute Disable) protection: active
Dec  3 08:17:26 np0005544118 kernel: APIC: Static calls initialized
Dec  3 08:17:26 np0005544118 kernel: SMBIOS 2.8 present.
Dec  3 08:17:26 np0005544118 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec  3 08:17:26 np0005544118 kernel: Hypervisor detected: KVM
Dec  3 08:17:26 np0005544118 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec  3 08:17:26 np0005544118 kernel: kvm-clock: using sched offset of 3422331070 cycles
Dec  3 08:17:26 np0005544118 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec  3 08:17:26 np0005544118 kernel: tsc: Detected 2799.998 MHz processor
Dec  3 08:17:26 np0005544118 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec  3 08:17:26 np0005544118 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec  3 08:17:26 np0005544118 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec  3 08:17:26 np0005544118 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec  3 08:17:26 np0005544118 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec  3 08:17:26 np0005544118 kernel: Using GB pages for direct mapping
Dec  3 08:17:26 np0005544118 kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec  3 08:17:26 np0005544118 kernel: ACPI: Early table checksum verification disabled
Dec  3 08:17:26 np0005544118 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec  3 08:17:26 np0005544118 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  3 08:17:26 np0005544118 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  3 08:17:26 np0005544118 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  3 08:17:26 np0005544118 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec  3 08:17:26 np0005544118 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  3 08:17:26 np0005544118 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec  3 08:17:26 np0005544118 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec  3 08:17:26 np0005544118 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec  3 08:17:26 np0005544118 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec  3 08:17:26 np0005544118 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec  3 08:17:26 np0005544118 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec  3 08:17:26 np0005544118 kernel: No NUMA configuration found
Dec  3 08:17:26 np0005544118 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec  3 08:17:26 np0005544118 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec  3 08:17:26 np0005544118 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec  3 08:17:26 np0005544118 kernel: Zone ranges:
Dec  3 08:17:26 np0005544118 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec  3 08:17:26 np0005544118 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec  3 08:17:26 np0005544118 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec  3 08:17:26 np0005544118 kernel:  Device   empty
Dec  3 08:17:26 np0005544118 kernel: Movable zone start for each node
Dec  3 08:17:26 np0005544118 kernel: Early memory node ranges
Dec  3 08:17:26 np0005544118 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec  3 08:17:26 np0005544118 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec  3 08:17:26 np0005544118 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec  3 08:17:26 np0005544118 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec  3 08:17:26 np0005544118 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec  3 08:17:26 np0005544118 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec  3 08:17:26 np0005544118 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec  3 08:17:26 np0005544118 kernel: ACPI: PM-Timer IO Port: 0x608
Dec  3 08:17:26 np0005544118 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec  3 08:17:26 np0005544118 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec  3 08:17:26 np0005544118 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec  3 08:17:26 np0005544118 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec  3 08:17:26 np0005544118 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec  3 08:17:26 np0005544118 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec  3 08:17:26 np0005544118 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec  3 08:17:26 np0005544118 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec  3 08:17:26 np0005544118 kernel: TSC deadline timer available
Dec  3 08:17:26 np0005544118 kernel: CPU topo: Max. logical packages:   8
Dec  3 08:17:26 np0005544118 kernel: CPU topo: Max. logical dies:       8
Dec  3 08:17:26 np0005544118 kernel: CPU topo: Max. dies per package:   1
Dec  3 08:17:26 np0005544118 kernel: CPU topo: Max. threads per core:   1
Dec  3 08:17:26 np0005544118 kernel: CPU topo: Num. cores per package:     1
Dec  3 08:17:26 np0005544118 kernel: CPU topo: Num. threads per package:   1
Dec  3 08:17:26 np0005544118 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec  3 08:17:26 np0005544118 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec  3 08:17:26 np0005544118 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec  3 08:17:26 np0005544118 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec  3 08:17:26 np0005544118 kernel: Booting paravirtualized kernel on KVM
Dec  3 08:17:26 np0005544118 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec  3 08:17:26 np0005544118 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec  3 08:17:26 np0005544118 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec  3 08:17:26 np0005544118 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec  3 08:17:26 np0005544118 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  3 08:17:26 np0005544118 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec  3 08:17:26 np0005544118 kernel: random: crng init done
Dec  3 08:17:26 np0005544118 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: Fallback order for Node 0: 0 
Dec  3 08:17:26 np0005544118 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec  3 08:17:26 np0005544118 kernel: Policy zone: Normal
Dec  3 08:17:26 np0005544118 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec  3 08:17:26 np0005544118 kernel: software IO TLB: area num 8.
Dec  3 08:17:26 np0005544118 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec  3 08:17:26 np0005544118 kernel: ftrace: allocating 49335 entries in 193 pages
Dec  3 08:17:26 np0005544118 kernel: ftrace: allocated 193 pages with 3 groups
Dec  3 08:17:26 np0005544118 kernel: Dynamic Preempt: voluntary
Dec  3 08:17:26 np0005544118 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec  3 08:17:26 np0005544118 kernel: rcu: 	RCU event tracing is enabled.
Dec  3 08:17:26 np0005544118 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec  3 08:17:26 np0005544118 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec  3 08:17:26 np0005544118 kernel: 	Rude variant of Tasks RCU enabled.
Dec  3 08:17:26 np0005544118 kernel: 	Tracing variant of Tasks RCU enabled.
Dec  3 08:17:26 np0005544118 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec  3 08:17:26 np0005544118 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec  3 08:17:26 np0005544118 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  3 08:17:26 np0005544118 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  3 08:17:26 np0005544118 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec  3 08:17:26 np0005544118 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec  3 08:17:26 np0005544118 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec  3 08:17:26 np0005544118 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec  3 08:17:26 np0005544118 kernel: Console: colour VGA+ 80x25
Dec  3 08:17:26 np0005544118 kernel: printk: console [ttyS0] enabled
Dec  3 08:17:26 np0005544118 kernel: ACPI: Core revision 20230331
Dec  3 08:17:26 np0005544118 kernel: APIC: Switch to symmetric I/O mode setup
Dec  3 08:17:26 np0005544118 kernel: x2apic enabled
Dec  3 08:17:26 np0005544118 kernel: APIC: Switched APIC routing to: physical x2apic
Dec  3 08:17:26 np0005544118 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec  3 08:17:26 np0005544118 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec  3 08:17:26 np0005544118 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec  3 08:17:26 np0005544118 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec  3 08:17:26 np0005544118 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec  3 08:17:26 np0005544118 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec  3 08:17:26 np0005544118 kernel: Spectre V2 : Mitigation: Retpolines
Dec  3 08:17:26 np0005544118 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec  3 08:17:26 np0005544118 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec  3 08:17:26 np0005544118 kernel: RETBleed: Mitigation: untrained return thunk
Dec  3 08:17:26 np0005544118 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec  3 08:17:26 np0005544118 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec  3 08:17:26 np0005544118 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec  3 08:17:26 np0005544118 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec  3 08:17:26 np0005544118 kernel: x86/bugs: return thunk changed
Dec  3 08:17:26 np0005544118 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec  3 08:17:26 np0005544118 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec  3 08:17:26 np0005544118 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec  3 08:17:26 np0005544118 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec  3 08:17:26 np0005544118 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec  3 08:17:26 np0005544118 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec  3 08:17:26 np0005544118 kernel: Freeing SMP alternatives memory: 40K
Dec  3 08:17:26 np0005544118 kernel: pid_max: default: 32768 minimum: 301
Dec  3 08:17:26 np0005544118 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec  3 08:17:26 np0005544118 kernel: landlock: Up and running.
Dec  3 08:17:26 np0005544118 kernel: Yama: becoming mindful.
Dec  3 08:17:26 np0005544118 kernel: SELinux:  Initializing.
Dec  3 08:17:26 np0005544118 kernel: LSM support for eBPF active
Dec  3 08:17:26 np0005544118 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec  3 08:17:26 np0005544118 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec  3 08:17:26 np0005544118 kernel: ... version:                0
Dec  3 08:17:26 np0005544118 kernel: ... bit width:              48
Dec  3 08:17:26 np0005544118 kernel: ... generic registers:      6
Dec  3 08:17:26 np0005544118 kernel: ... value mask:             0000ffffffffffff
Dec  3 08:17:26 np0005544118 kernel: ... max period:             00007fffffffffff
Dec  3 08:17:26 np0005544118 kernel: ... fixed-purpose events:   0
Dec  3 08:17:26 np0005544118 kernel: ... event mask:             000000000000003f
Dec  3 08:17:26 np0005544118 kernel: signal: max sigframe size: 1776
Dec  3 08:17:26 np0005544118 kernel: rcu: Hierarchical SRCU implementation.
Dec  3 08:17:26 np0005544118 kernel: rcu: 	Max phase no-delay instances is 400.
Dec  3 08:17:26 np0005544118 kernel: smp: Bringing up secondary CPUs ...
Dec  3 08:17:26 np0005544118 kernel: smpboot: x86: Booting SMP configuration:
Dec  3 08:17:26 np0005544118 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec  3 08:17:26 np0005544118 kernel: smp: Brought up 1 node, 8 CPUs
Dec  3 08:17:26 np0005544118 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec  3 08:17:26 np0005544118 kernel: node 0 deferred pages initialised in 9ms
Dec  3 08:17:26 np0005544118 kernel: Memory: 7763984K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618212K reserved, 0K cma-reserved)
Dec  3 08:17:26 np0005544118 kernel: devtmpfs: initialized
Dec  3 08:17:26 np0005544118 kernel: x86/mm: Memory block size: 128MB
Dec  3 08:17:26 np0005544118 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec  3 08:17:26 np0005544118 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec  3 08:17:26 np0005544118 kernel: pinctrl core: initialized pinctrl subsystem
Dec  3 08:17:26 np0005544118 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec  3 08:17:26 np0005544118 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec  3 08:17:26 np0005544118 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec  3 08:17:26 np0005544118 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec  3 08:17:26 np0005544118 kernel: audit: initializing netlink subsys (disabled)
Dec  3 08:17:26 np0005544118 kernel: audit: type=2000 audit(1764767844.876:1): state=initialized audit_enabled=0 res=1
Dec  3 08:17:26 np0005544118 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec  3 08:17:26 np0005544118 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec  3 08:17:26 np0005544118 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec  3 08:17:26 np0005544118 kernel: cpuidle: using governor menu
Dec  3 08:17:26 np0005544118 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec  3 08:17:26 np0005544118 kernel: PCI: Using configuration type 1 for base access
Dec  3 08:17:26 np0005544118 kernel: PCI: Using configuration type 1 for extended access
Dec  3 08:17:26 np0005544118 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec  3 08:17:26 np0005544118 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec  3 08:17:26 np0005544118 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec  3 08:17:26 np0005544118 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec  3 08:17:26 np0005544118 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec  3 08:17:26 np0005544118 kernel: Demotion targets for Node 0: null
Dec  3 08:17:26 np0005544118 kernel: cryptd: max_cpu_qlen set to 1000
Dec  3 08:17:26 np0005544118 kernel: ACPI: Added _OSI(Module Device)
Dec  3 08:17:26 np0005544118 kernel: ACPI: Added _OSI(Processor Device)
Dec  3 08:17:26 np0005544118 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec  3 08:17:26 np0005544118 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec  3 08:17:26 np0005544118 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec  3 08:17:26 np0005544118 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec  3 08:17:26 np0005544118 kernel: ACPI: Interpreter enabled
Dec  3 08:17:26 np0005544118 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec  3 08:17:26 np0005544118 kernel: ACPI: Using IOAPIC for interrupt routing
Dec  3 08:17:26 np0005544118 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec  3 08:17:26 np0005544118 kernel: PCI: Using E820 reservations for host bridge windows
Dec  3 08:17:26 np0005544118 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec  3 08:17:26 np0005544118 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec  3 08:17:26 np0005544118 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [3] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [4] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [5] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [6] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [7] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [8] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [9] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [10] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [11] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [12] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [13] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [14] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [15] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [16] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [17] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [18] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [19] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [20] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [21] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [22] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [23] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [24] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [25] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [26] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [27] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [28] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [29] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [30] registered
Dec  3 08:17:26 np0005544118 kernel: acpiphp: Slot [31] registered
Dec  3 08:17:26 np0005544118 kernel: PCI host bridge to bus 0000:00
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec  3 08:17:26 np0005544118 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec  3 08:17:26 np0005544118 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec  3 08:17:26 np0005544118 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec  3 08:17:26 np0005544118 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec  3 08:17:26 np0005544118 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec  3 08:17:26 np0005544118 kernel: iommu: Default domain type: Translated
Dec  3 08:17:26 np0005544118 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec  3 08:17:26 np0005544118 kernel: SCSI subsystem initialized
Dec  3 08:17:26 np0005544118 kernel: ACPI: bus type USB registered
Dec  3 08:17:26 np0005544118 kernel: usbcore: registered new interface driver usbfs
Dec  3 08:17:26 np0005544118 kernel: usbcore: registered new interface driver hub
Dec  3 08:17:26 np0005544118 kernel: usbcore: registered new device driver usb
Dec  3 08:17:26 np0005544118 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec  3 08:17:26 np0005544118 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec  3 08:17:26 np0005544118 kernel: PTP clock support registered
Dec  3 08:17:26 np0005544118 kernel: EDAC MC: Ver: 3.0.0
Dec  3 08:17:26 np0005544118 kernel: NetLabel: Initializing
Dec  3 08:17:26 np0005544118 kernel: NetLabel:  domain hash size = 128
Dec  3 08:17:26 np0005544118 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec  3 08:17:26 np0005544118 kernel: NetLabel:  unlabeled traffic allowed by default
Dec  3 08:17:26 np0005544118 kernel: PCI: Using ACPI for IRQ routing
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec  3 08:17:26 np0005544118 kernel: vgaarb: loaded
Dec  3 08:17:26 np0005544118 kernel: clocksource: Switched to clocksource kvm-clock
Dec  3 08:17:26 np0005544118 kernel: VFS: Disk quotas dquot_6.6.0
Dec  3 08:17:26 np0005544118 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec  3 08:17:26 np0005544118 kernel: pnp: PnP ACPI init
Dec  3 08:17:26 np0005544118 kernel: pnp: PnP ACPI: found 5 devices
Dec  3 08:17:26 np0005544118 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec  3 08:17:26 np0005544118 kernel: NET: Registered PF_INET protocol family
Dec  3 08:17:26 np0005544118 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec  3 08:17:26 np0005544118 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec  3 08:17:26 np0005544118 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec  3 08:17:26 np0005544118 kernel: NET: Registered PF_XDP protocol family
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec  3 08:17:26 np0005544118 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec  3 08:17:26 np0005544118 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec  3 08:17:26 np0005544118 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 86417 usecs
Dec  3 08:17:26 np0005544118 kernel: PCI: CLS 0 bytes, default 64
Dec  3 08:17:26 np0005544118 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec  3 08:17:26 np0005544118 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec  3 08:17:26 np0005544118 kernel: ACPI: bus type thunderbolt registered
Dec  3 08:17:26 np0005544118 kernel: Trying to unpack rootfs image as initramfs...
Dec  3 08:17:26 np0005544118 kernel: Initialise system trusted keyrings
Dec  3 08:17:26 np0005544118 kernel: Key type blacklist registered
Dec  3 08:17:26 np0005544118 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec  3 08:17:26 np0005544118 kernel: zbud: loaded
Dec  3 08:17:26 np0005544118 kernel: integrity: Platform Keyring initialized
Dec  3 08:17:26 np0005544118 kernel: integrity: Machine keyring initialized
Dec  3 08:17:26 np0005544118 kernel: Freeing initrd memory: 87804K
Dec  3 08:17:26 np0005544118 kernel: NET: Registered PF_ALG protocol family
Dec  3 08:17:26 np0005544118 kernel: xor: automatically using best checksumming function   avx       
Dec  3 08:17:26 np0005544118 kernel: Key type asymmetric registered
Dec  3 08:17:26 np0005544118 kernel: Asymmetric key parser 'x509' registered
Dec  3 08:17:26 np0005544118 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec  3 08:17:26 np0005544118 kernel: io scheduler mq-deadline registered
Dec  3 08:17:26 np0005544118 kernel: io scheduler kyber registered
Dec  3 08:17:26 np0005544118 kernel: io scheduler bfq registered
Dec  3 08:17:26 np0005544118 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec  3 08:17:26 np0005544118 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec  3 08:17:26 np0005544118 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec  3 08:17:26 np0005544118 kernel: ACPI: button: Power Button [PWRF]
Dec  3 08:17:26 np0005544118 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec  3 08:17:26 np0005544118 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec  3 08:17:26 np0005544118 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec  3 08:17:26 np0005544118 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec  3 08:17:26 np0005544118 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec  3 08:17:26 np0005544118 kernel: Non-volatile memory driver v1.3
Dec  3 08:17:26 np0005544118 kernel: rdac: device handler registered
Dec  3 08:17:26 np0005544118 kernel: hp_sw: device handler registered
Dec  3 08:17:26 np0005544118 kernel: emc: device handler registered
Dec  3 08:17:26 np0005544118 kernel: alua: device handler registered
Dec  3 08:17:26 np0005544118 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec  3 08:17:26 np0005544118 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec  3 08:17:26 np0005544118 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec  3 08:17:26 np0005544118 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec  3 08:17:26 np0005544118 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec  3 08:17:26 np0005544118 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec  3 08:17:26 np0005544118 kernel: usb usb1: Product: UHCI Host Controller
Dec  3 08:17:26 np0005544118 kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec  3 08:17:26 np0005544118 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec  3 08:17:26 np0005544118 kernel: hub 1-0:1.0: USB hub found
Dec  3 08:17:26 np0005544118 kernel: hub 1-0:1.0: 2 ports detected
Dec  3 08:17:26 np0005544118 kernel: usbcore: registered new interface driver usbserial_generic
Dec  3 08:17:26 np0005544118 kernel: usbserial: USB Serial support registered for generic
Dec  3 08:17:26 np0005544118 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec  3 08:17:26 np0005544118 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec  3 08:17:26 np0005544118 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec  3 08:17:26 np0005544118 kernel: mousedev: PS/2 mouse device common for all mice
Dec  3 08:17:26 np0005544118 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec  3 08:17:26 np0005544118 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec  3 08:17:26 np0005544118 kernel: rtc_cmos 00:04: registered as rtc0
Dec  3 08:17:26 np0005544118 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec  3 08:17:26 np0005544118 kernel: rtc_cmos 00:04: setting system clock to 2025-12-03T13:17:25 UTC (1764767845)
Dec  3 08:17:26 np0005544118 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec  3 08:17:26 np0005544118 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec  3 08:17:26 np0005544118 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec  3 08:17:26 np0005544118 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec  3 08:17:26 np0005544118 kernel: usbcore: registered new interface driver usbhid
Dec  3 08:17:26 np0005544118 kernel: usbhid: USB HID core driver
Dec  3 08:17:26 np0005544118 kernel: drop_monitor: Initializing network drop monitor service
Dec  3 08:17:26 np0005544118 kernel: Initializing XFRM netlink socket
Dec  3 08:17:26 np0005544118 kernel: NET: Registered PF_INET6 protocol family
Dec  3 08:17:26 np0005544118 kernel: Segment Routing with IPv6
Dec  3 08:17:26 np0005544118 kernel: NET: Registered PF_PACKET protocol family
Dec  3 08:17:26 np0005544118 kernel: mpls_gso: MPLS GSO support
Dec  3 08:17:26 np0005544118 kernel: IPI shorthand broadcast: enabled
Dec  3 08:17:26 np0005544118 kernel: AVX2 version of gcm_enc/dec engaged.
Dec  3 08:17:26 np0005544118 kernel: AES CTR mode by8 optimization enabled
Dec  3 08:17:26 np0005544118 kernel: sched_clock: Marking stable (1210028162, 145342774)->(1491008855, -135637919)
Dec  3 08:17:26 np0005544118 kernel: registered taskstats version 1
Dec  3 08:17:26 np0005544118 kernel: Loading compiled-in X.509 certificates
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec  3 08:17:26 np0005544118 kernel: Demotion targets for Node 0: null
Dec  3 08:17:26 np0005544118 kernel: page_owner is disabled
Dec  3 08:17:26 np0005544118 kernel: Key type .fscrypt registered
Dec  3 08:17:26 np0005544118 kernel: Key type fscrypt-provisioning registered
Dec  3 08:17:26 np0005544118 kernel: Key type big_key registered
Dec  3 08:17:26 np0005544118 kernel: Key type encrypted registered
Dec  3 08:17:26 np0005544118 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec  3 08:17:26 np0005544118 kernel: Loading compiled-in module X.509 certificates
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec  3 08:17:26 np0005544118 kernel: ima: Allocated hash algorithm: sha256
Dec  3 08:17:26 np0005544118 kernel: ima: No architecture policies found
Dec  3 08:17:26 np0005544118 kernel: evm: Initialising EVM extended attributes:
Dec  3 08:17:26 np0005544118 kernel: evm: security.selinux
Dec  3 08:17:26 np0005544118 kernel: evm: security.SMACK64 (disabled)
Dec  3 08:17:26 np0005544118 kernel: evm: security.SMACK64EXEC (disabled)
Dec  3 08:17:26 np0005544118 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec  3 08:17:26 np0005544118 kernel: evm: security.SMACK64MMAP (disabled)
Dec  3 08:17:26 np0005544118 kernel: evm: security.apparmor (disabled)
Dec  3 08:17:26 np0005544118 kernel: evm: security.ima
Dec  3 08:17:26 np0005544118 kernel: evm: security.capability
Dec  3 08:17:26 np0005544118 kernel: evm: HMAC attrs: 0x1
Dec  3 08:17:26 np0005544118 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec  3 08:17:26 np0005544118 kernel: Running certificate verification RSA selftest
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec  3 08:17:26 np0005544118 kernel: Running certificate verification ECDSA selftest
Dec  3 08:17:26 np0005544118 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec  3 08:17:26 np0005544118 kernel: clk: Disabling unused clocks
Dec  3 08:17:26 np0005544118 kernel: Freeing unused decrypted memory: 2028K
Dec  3 08:17:26 np0005544118 kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec  3 08:17:26 np0005544118 kernel: Write protecting the kernel read-only data: 30720k
Dec  3 08:17:26 np0005544118 kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec  3 08:17:26 np0005544118 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec  3 08:17:26 np0005544118 kernel: Run /init as init process
Dec  3 08:17:26 np0005544118 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec  3 08:17:26 np0005544118 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec  3 08:17:26 np0005544118 kernel: usb 1-1: Product: QEMU USB Tablet
Dec  3 08:17:26 np0005544118 kernel: usb 1-1: Manufacturer: QEMU
Dec  3 08:17:26 np0005544118 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec  3 08:17:26 np0005544118 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  3 08:17:26 np0005544118 systemd: Detected virtualization kvm.
Dec  3 08:17:26 np0005544118 systemd: Detected architecture x86-64.
Dec  3 08:17:26 np0005544118 systemd: Running in initrd.
Dec  3 08:17:26 np0005544118 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec  3 08:17:26 np0005544118 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec  3 08:17:26 np0005544118 systemd: No hostname configured, using default hostname.
Dec  3 08:17:26 np0005544118 systemd: Hostname set to <localhost>.
Dec  3 08:17:26 np0005544118 systemd: Initializing machine ID from VM UUID.
Dec  3 08:17:26 np0005544118 systemd: Queued start job for default target Initrd Default Target.
Dec  3 08:17:26 np0005544118 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  3 08:17:26 np0005544118 systemd: Reached target Local Encrypted Volumes.
Dec  3 08:17:26 np0005544118 systemd: Reached target Initrd /usr File System.
Dec  3 08:17:26 np0005544118 systemd: Reached target Local File Systems.
Dec  3 08:17:26 np0005544118 systemd: Reached target Path Units.
Dec  3 08:17:26 np0005544118 systemd: Reached target Slice Units.
Dec  3 08:17:26 np0005544118 systemd: Reached target Swaps.
Dec  3 08:17:26 np0005544118 systemd: Reached target Timer Units.
Dec  3 08:17:26 np0005544118 systemd: Listening on D-Bus System Message Bus Socket.
Dec  3 08:17:26 np0005544118 systemd: Listening on Journal Socket (/dev/log).
Dec  3 08:17:26 np0005544118 systemd: Listening on Journal Socket.
Dec  3 08:17:26 np0005544118 systemd: Listening on udev Control Socket.
Dec  3 08:17:26 np0005544118 systemd: Listening on udev Kernel Socket.
Dec  3 08:17:26 np0005544118 systemd: Reached target Socket Units.
Dec  3 08:17:26 np0005544118 systemd: Starting Create List of Static Device Nodes...
Dec  3 08:17:26 np0005544118 systemd: Starting Journal Service...
Dec  3 08:17:26 np0005544118 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  3 08:17:26 np0005544118 systemd: Starting Apply Kernel Variables...
Dec  3 08:17:26 np0005544118 systemd: Starting Create System Users...
Dec  3 08:17:26 np0005544118 systemd: Starting Setup Virtual Console...
Dec  3 08:17:26 np0005544118 systemd: Finished Create List of Static Device Nodes.
Dec  3 08:17:26 np0005544118 systemd: Finished Apply Kernel Variables.
Dec  3 08:17:26 np0005544118 systemd: Finished Create System Users.
Dec  3 08:17:26 np0005544118 systemd: Starting Create Static Device Nodes in /dev...
Dec  3 08:17:26 np0005544118 systemd-journald[304]: Journal started
Dec  3 08:17:26 np0005544118 systemd-journald[304]: Runtime Journal (/run/log/journal/5f58e40a1e39483895dfccf3b2b3eaa6) is 8.0M, max 153.6M, 145.6M free.
Dec  3 08:17:26 np0005544118 systemd-sysusers[309]: Creating group 'users' with GID 100.
Dec  3 08:17:26 np0005544118 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Dec  3 08:17:26 np0005544118 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec  3 08:17:26 np0005544118 systemd: Started Journal Service.
Dec  3 08:17:26 np0005544118 systemd[1]: Starting Create Volatile Files and Directories...
Dec  3 08:17:26 np0005544118 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  3 08:17:26 np0005544118 systemd[1]: Finished Create Volatile Files and Directories.
Dec  3 08:17:26 np0005544118 systemd[1]: Finished Setup Virtual Console.
Dec  3 08:17:26 np0005544118 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec  3 08:17:26 np0005544118 systemd[1]: Starting dracut cmdline hook...
Dec  3 08:17:26 np0005544118 dracut-cmdline[324]: dracut-9 dracut-057-102.git20250818.el9
Dec  3 08:17:26 np0005544118 dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec  3 08:17:26 np0005544118 systemd[1]: Finished dracut cmdline hook.
Dec  3 08:17:26 np0005544118 systemd[1]: Starting dracut pre-udev hook...
Dec  3 08:17:26 np0005544118 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec  3 08:17:26 np0005544118 kernel: device-mapper: uevent: version 1.0.3
Dec  3 08:17:26 np0005544118 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec  3 08:17:26 np0005544118 kernel: RPC: Registered named UNIX socket transport module.
Dec  3 08:17:26 np0005544118 kernel: RPC: Registered udp transport module.
Dec  3 08:17:26 np0005544118 kernel: RPC: Registered tcp transport module.
Dec  3 08:17:26 np0005544118 kernel: RPC: Registered tcp-with-tls transport module.
Dec  3 08:17:26 np0005544118 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec  3 08:17:26 np0005544118 rpc.statd[441]: Version 2.5.4 starting
Dec  3 08:17:26 np0005544118 rpc.statd[441]: Initializing NSM state
Dec  3 08:17:26 np0005544118 rpc.idmapd[446]: Setting log level to 0
Dec  3 08:17:26 np0005544118 systemd[1]: Finished dracut pre-udev hook.
Dec  3 08:17:26 np0005544118 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  3 08:17:27 np0005544118 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Dec  3 08:17:27 np0005544118 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  3 08:17:27 np0005544118 systemd[1]: Starting dracut pre-trigger hook...
Dec  3 08:17:27 np0005544118 systemd[1]: Finished dracut pre-trigger hook.
Dec  3 08:17:27 np0005544118 systemd[1]: Starting Coldplug All udev Devices...
Dec  3 08:17:27 np0005544118 systemd[1]: Created slice Slice /system/modprobe.
Dec  3 08:17:27 np0005544118 systemd[1]: Starting Load Kernel Module configfs...
Dec  3 08:17:27 np0005544118 systemd[1]: Finished Coldplug All udev Devices.
Dec  3 08:17:27 np0005544118 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  3 08:17:27 np0005544118 systemd[1]: Reached target Network.
Dec  3 08:17:27 np0005544118 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec  3 08:17:27 np0005544118 systemd[1]: Starting dracut initqueue hook...
Dec  3 08:17:27 np0005544118 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  3 08:17:27 np0005544118 systemd[1]: Finished Load Kernel Module configfs.
Dec  3 08:17:27 np0005544118 systemd[1]: Mounting Kernel Configuration File System...
Dec  3 08:17:27 np0005544118 systemd[1]: Mounted Kernel Configuration File System.
Dec  3 08:17:27 np0005544118 systemd[1]: Reached target System Initialization.
Dec  3 08:17:27 np0005544118 systemd[1]: Reached target Basic System.
Dec  3 08:17:27 np0005544118 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec  3 08:17:27 np0005544118 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec  3 08:17:27 np0005544118 kernel: vda: vda1
Dec  3 08:17:27 np0005544118 systemd-udevd[483]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 08:17:27 np0005544118 systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  3 08:17:27 np0005544118 systemd[1]: Reached target Initrd Root Device.
Dec  3 08:17:27 np0005544118 kernel: scsi host0: ata_piix
Dec  3 08:17:27 np0005544118 kernel: scsi host1: ata_piix
Dec  3 08:17:27 np0005544118 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec  3 08:17:27 np0005544118 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec  3 08:17:27 np0005544118 kernel: ata1: found unknown device (class 0)
Dec  3 08:17:27 np0005544118 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec  3 08:17:27 np0005544118 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec  3 08:17:27 np0005544118 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec  3 08:17:27 np0005544118 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec  3 08:17:27 np0005544118 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec  3 08:17:27 np0005544118 systemd[1]: Finished dracut initqueue hook.
Dec  3 08:17:27 np0005544118 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  3 08:17:27 np0005544118 systemd[1]: Reached target Remote Encrypted Volumes.
Dec  3 08:17:27 np0005544118 systemd[1]: Reached target Remote File Systems.
Dec  3 08:17:27 np0005544118 systemd[1]: Starting dracut pre-mount hook...
Dec  3 08:17:27 np0005544118 systemd[1]: Finished dracut pre-mount hook.
Dec  3 08:17:27 np0005544118 systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec  3 08:17:27 np0005544118 systemd-fsck[557]: /usr/sbin/fsck.xfs: XFS file system.
Dec  3 08:17:27 np0005544118 systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec  3 08:17:27 np0005544118 systemd[1]: Mounting /sysroot...
Dec  3 08:17:28 np0005544118 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec  3 08:17:28 np0005544118 kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec  3 08:17:28 np0005544118 kernel: XFS (vda1): Ending clean mount
Dec  3 08:17:28 np0005544118 systemd[1]: Mounted /sysroot.
Dec  3 08:17:28 np0005544118 systemd[1]: Reached target Initrd Root File System.
Dec  3 08:17:28 np0005544118 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec  3 08:17:28 np0005544118 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec  3 08:17:28 np0005544118 systemd[1]: Reached target Initrd File Systems.
Dec  3 08:17:28 np0005544118 systemd[1]: Reached target Initrd Default Target.
Dec  3 08:17:28 np0005544118 systemd[1]: Starting dracut mount hook...
Dec  3 08:17:28 np0005544118 systemd[1]: Finished dracut mount hook.
Dec  3 08:17:28 np0005544118 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec  3 08:17:28 np0005544118 rpc.idmapd[446]: exiting on signal 15
Dec  3 08:17:28 np0005544118 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec  3 08:17:28 np0005544118 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Network.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Timer Units.
Dec  3 08:17:28 np0005544118 systemd[1]: dbus.socket: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec  3 08:17:28 np0005544118 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Initrd Default Target.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Basic System.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Initrd Root Device.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Initrd /usr File System.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Path Units.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Remote File Systems.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Slice Units.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Socket Units.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target System Initialization.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Local File Systems.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Swaps.
Dec  3 08:17:28 np0005544118 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped dracut mount hook.
Dec  3 08:17:28 np0005544118 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped dracut pre-mount hook.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped target Local Encrypted Volumes.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec  3 08:17:28 np0005544118 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped dracut initqueue hook.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Apply Kernel Variables.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Create Volatile Files and Directories.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Coldplug All udev Devices.
Dec  3 08:17:28 np0005544118 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped dracut pre-trigger hook.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Setup Virtual Console.
Dec  3 08:17:28 np0005544118 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec  3 08:17:28 np0005544118 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Closed udev Control Socket.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Closed udev Kernel Socket.
Dec  3 08:17:28 np0005544118 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped dracut pre-udev hook.
Dec  3 08:17:28 np0005544118 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped dracut cmdline hook.
Dec  3 08:17:28 np0005544118 systemd[1]: Starting Cleanup udev Database...
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec  3 08:17:28 np0005544118 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Create List of Static Device Nodes.
Dec  3 08:17:28 np0005544118 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Stopped Create System Users.
Dec  3 08:17:28 np0005544118 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec  3 08:17:28 np0005544118 systemd[1]: Finished Cleanup udev Database.
Dec  3 08:17:28 np0005544118 systemd[1]: Reached target Switch Root.
Dec  3 08:17:28 np0005544118 systemd[1]: Starting Switch Root...
Dec  3 08:17:28 np0005544118 systemd[1]: Switching root.
Dec  3 08:17:28 np0005544118 systemd-journald[304]: Journal stopped
Dec  3 08:17:29 np0005544118 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec  3 08:17:29 np0005544118 kernel: audit: type=1404 audit(1764767848.639:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec  3 08:17:29 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:17:29 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:17:29 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:17:29 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:17:29 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:17:29 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:17:29 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:17:29 np0005544118 kernel: audit: type=1403 audit(1764767848.765:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec  3 08:17:29 np0005544118 systemd: Successfully loaded SELinux policy in 130.411ms.
Dec  3 08:17:29 np0005544118 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 28.085ms.
Dec  3 08:17:29 np0005544118 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec  3 08:17:29 np0005544118 systemd: Detected virtualization kvm.
Dec  3 08:17:29 np0005544118 systemd: Detected architecture x86-64.
Dec  3 08:17:29 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:17:29 np0005544118 systemd: initrd-switch-root.service: Deactivated successfully.
Dec  3 08:17:29 np0005544118 systemd: Stopped Switch Root.
Dec  3 08:17:29 np0005544118 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec  3 08:17:29 np0005544118 systemd: Created slice Slice /system/getty.
Dec  3 08:17:29 np0005544118 systemd: Created slice Slice /system/serial-getty.
Dec  3 08:17:29 np0005544118 systemd: Created slice Slice /system/sshd-keygen.
Dec  3 08:17:29 np0005544118 systemd: Created slice User and Session Slice.
Dec  3 08:17:29 np0005544118 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec  3 08:17:29 np0005544118 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec  3 08:17:29 np0005544118 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec  3 08:17:29 np0005544118 systemd: Reached target Local Encrypted Volumes.
Dec  3 08:17:29 np0005544118 systemd: Stopped target Switch Root.
Dec  3 08:17:29 np0005544118 systemd: Stopped target Initrd File Systems.
Dec  3 08:17:29 np0005544118 systemd: Stopped target Initrd Root File System.
Dec  3 08:17:29 np0005544118 systemd: Reached target Local Integrity Protected Volumes.
Dec  3 08:17:29 np0005544118 systemd: Reached target Path Units.
Dec  3 08:17:29 np0005544118 systemd: Reached target rpc_pipefs.target.
Dec  3 08:17:29 np0005544118 systemd: Reached target Slice Units.
Dec  3 08:17:29 np0005544118 systemd: Reached target Swaps.
Dec  3 08:17:29 np0005544118 systemd: Reached target Local Verity Protected Volumes.
Dec  3 08:17:29 np0005544118 systemd: Listening on RPCbind Server Activation Socket.
Dec  3 08:17:29 np0005544118 systemd: Reached target RPC Port Mapper.
Dec  3 08:17:29 np0005544118 systemd: Listening on Process Core Dump Socket.
Dec  3 08:17:29 np0005544118 systemd: Listening on initctl Compatibility Named Pipe.
Dec  3 08:17:29 np0005544118 systemd: Listening on udev Control Socket.
Dec  3 08:17:29 np0005544118 systemd: Listening on udev Kernel Socket.
Dec  3 08:17:29 np0005544118 systemd: Mounting Huge Pages File System...
Dec  3 08:17:29 np0005544118 systemd: Mounting POSIX Message Queue File System...
Dec  3 08:17:29 np0005544118 systemd: Mounting Kernel Debug File System...
Dec  3 08:17:29 np0005544118 systemd: Mounting Kernel Trace File System...
Dec  3 08:17:29 np0005544118 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  3 08:17:29 np0005544118 systemd: Starting Create List of Static Device Nodes...
Dec  3 08:17:29 np0005544118 systemd: Starting Load Kernel Module configfs...
Dec  3 08:17:29 np0005544118 systemd: Starting Load Kernel Module drm...
Dec  3 08:17:29 np0005544118 systemd: Starting Load Kernel Module efi_pstore...
Dec  3 08:17:29 np0005544118 systemd: Starting Load Kernel Module fuse...
Dec  3 08:17:29 np0005544118 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec  3 08:17:29 np0005544118 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec  3 08:17:29 np0005544118 systemd: Stopped File System Check on Root Device.
Dec  3 08:17:29 np0005544118 systemd: Stopped Journal Service.
Dec  3 08:17:29 np0005544118 kernel: fuse: init (API version 7.37)
Dec  3 08:17:29 np0005544118 systemd: Starting Journal Service...
Dec  3 08:17:29 np0005544118 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec  3 08:17:29 np0005544118 systemd: Starting Generate network units from Kernel command line...
Dec  3 08:17:29 np0005544118 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  3 08:17:29 np0005544118 systemd: Starting Remount Root and Kernel File Systems...
Dec  3 08:17:29 np0005544118 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec  3 08:17:29 np0005544118 systemd: Starting Apply Kernel Variables...
Dec  3 08:17:29 np0005544118 systemd-journald[679]: Journal started
Dec  3 08:17:29 np0005544118 systemd-journald[679]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  3 08:17:29 np0005544118 systemd[1]: Queued start job for default target Multi-User System.
Dec  3 08:17:29 np0005544118 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec  3 08:17:29 np0005544118 systemd: Starting Coldplug All udev Devices...
Dec  3 08:17:29 np0005544118 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec  3 08:17:29 np0005544118 systemd: Started Journal Service.
Dec  3 08:17:29 np0005544118 systemd[1]: Mounted Huge Pages File System.
Dec  3 08:17:29 np0005544118 systemd[1]: Mounted POSIX Message Queue File System.
Dec  3 08:17:29 np0005544118 systemd[1]: Mounted Kernel Debug File System.
Dec  3 08:17:29 np0005544118 systemd[1]: Mounted Kernel Trace File System.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Create List of Static Device Nodes.
Dec  3 08:17:29 np0005544118 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Load Kernel Module configfs.
Dec  3 08:17:29 np0005544118 kernel: ACPI: bus type drm_connector registered
Dec  3 08:17:29 np0005544118 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec  3 08:17:29 np0005544118 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Load Kernel Module drm.
Dec  3 08:17:29 np0005544118 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Load Kernel Module fuse.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Generate network units from Kernel command line.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Apply Kernel Variables.
Dec  3 08:17:29 np0005544118 systemd[1]: Mounting FUSE Control File System...
Dec  3 08:17:29 np0005544118 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Rebuild Hardware Database...
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec  3 08:17:29 np0005544118 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Load/Save OS Random Seed...
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Create System Users...
Dec  3 08:17:29 np0005544118 systemd[1]: Mounted FUSE Control File System.
Dec  3 08:17:29 np0005544118 systemd-journald[679]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec  3 08:17:29 np0005544118 systemd-journald[679]: Received client request to flush runtime journal.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Load/Save OS Random Seed.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec  3 08:17:29 np0005544118 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Coldplug All udev Devices.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Create System Users.
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec  3 08:17:29 np0005544118 systemd[1]: Reached target Preparation for Local File Systems.
Dec  3 08:17:29 np0005544118 systemd[1]: Reached target Local File Systems.
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec  3 08:17:29 np0005544118 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec  3 08:17:29 np0005544118 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec  3 08:17:29 np0005544118 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Automatic Boot Loader Update...
Dec  3 08:17:29 np0005544118 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Create Volatile Files and Directories...
Dec  3 08:17:29 np0005544118 bootctl[697]: Couldn't find EFI system partition, skipping.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Automatic Boot Loader Update.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Create Volatile Files and Directories.
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Security Auditing Service...
Dec  3 08:17:29 np0005544118 systemd[1]: Starting RPC Bind...
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Rebuild Journal Catalog...
Dec  3 08:17:29 np0005544118 auditd[704]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec  3 08:17:29 np0005544118 auditd[704]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Rebuild Journal Catalog.
Dec  3 08:17:29 np0005544118 systemd[1]: Started RPC Bind.
Dec  3 08:17:29 np0005544118 augenrules[709]: /sbin/augenrules: No change
Dec  3 08:17:29 np0005544118 augenrules[724]: No rules
Dec  3 08:17:29 np0005544118 augenrules[724]: enabled 1
Dec  3 08:17:29 np0005544118 augenrules[724]: failure 1
Dec  3 08:17:29 np0005544118 augenrules[724]: pid 704
Dec  3 08:17:29 np0005544118 augenrules[724]: rate_limit 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_limit 8192
Dec  3 08:17:29 np0005544118 augenrules[724]: lost 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog 1
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_wait_time 60000
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_wait_time_actual 0
Dec  3 08:17:29 np0005544118 augenrules[724]: enabled 1
Dec  3 08:17:29 np0005544118 augenrules[724]: failure 1
Dec  3 08:17:29 np0005544118 augenrules[724]: pid 704
Dec  3 08:17:29 np0005544118 augenrules[724]: rate_limit 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_limit 8192
Dec  3 08:17:29 np0005544118 augenrules[724]: lost 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_wait_time 60000
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_wait_time_actual 0
Dec  3 08:17:29 np0005544118 augenrules[724]: enabled 1
Dec  3 08:17:29 np0005544118 augenrules[724]: failure 1
Dec  3 08:17:29 np0005544118 augenrules[724]: pid 704
Dec  3 08:17:29 np0005544118 augenrules[724]: rate_limit 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_limit 8192
Dec  3 08:17:29 np0005544118 augenrules[724]: lost 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog 0
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_wait_time 60000
Dec  3 08:17:29 np0005544118 augenrules[724]: backlog_wait_time_actual 0
Dec  3 08:17:29 np0005544118 systemd[1]: Started Security Auditing Service.
Dec  3 08:17:29 np0005544118 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec  3 08:17:29 np0005544118 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec  3 08:17:30 np0005544118 systemd[1]: Finished Rebuild Hardware Database.
Dec  3 08:17:30 np0005544118 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec  3 08:17:30 np0005544118 systemd[1]: Starting Update is Completed...
Dec  3 08:17:30 np0005544118 systemd[1]: Finished Update is Completed.
Dec  3 08:17:30 np0005544118 systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Dec  3 08:17:30 np0005544118 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec  3 08:17:30 np0005544118 systemd[1]: Reached target System Initialization.
Dec  3 08:17:30 np0005544118 systemd[1]: Started dnf makecache --timer.
Dec  3 08:17:30 np0005544118 systemd[1]: Started Daily rotation of log files.
Dec  3 08:17:30 np0005544118 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec  3 08:17:30 np0005544118 systemd[1]: Reached target Timer Units.
Dec  3 08:17:30 np0005544118 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec  3 08:17:30 np0005544118 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec  3 08:17:30 np0005544118 systemd[1]: Reached target Socket Units.
Dec  3 08:17:30 np0005544118 systemd[1]: Starting D-Bus System Message Bus...
Dec  3 08:17:30 np0005544118 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  3 08:17:30 np0005544118 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec  3 08:17:30 np0005544118 systemd[1]: Starting Load Kernel Module configfs...
Dec  3 08:17:30 np0005544118 systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 08:17:30 np0005544118 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec  3 08:17:30 np0005544118 systemd[1]: Finished Load Kernel Module configfs.
Dec  3 08:17:30 np0005544118 systemd[1]: Started D-Bus System Message Bus.
Dec  3 08:17:30 np0005544118 systemd[1]: Reached target Basic System.
Dec  3 08:17:30 np0005544118 dbus-broker-lau[771]: Ready
Dec  3 08:17:30 np0005544118 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec  3 08:17:30 np0005544118 systemd[1]: Starting NTP client/server...
Dec  3 08:17:30 np0005544118 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec  3 08:17:30 np0005544118 chronyd[792]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  3 08:17:30 np0005544118 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec  3 08:17:30 np0005544118 chronyd[792]: Loaded 0 symmetric keys
Dec  3 08:17:30 np0005544118 systemd[1]: Starting IPv4 firewall with iptables...
Dec  3 08:17:30 np0005544118 chronyd[792]: Using right/UTC timezone to obtain leap second data
Dec  3 08:17:30 np0005544118 systemd[1]: Started irqbalance daemon.
Dec  3 08:17:30 np0005544118 chronyd[792]: Loaded seccomp filter (level 2)
Dec  3 08:17:30 np0005544118 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec  3 08:17:30 np0005544118 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  3 08:17:30 np0005544118 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  3 08:17:30 np0005544118 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  3 08:17:30 np0005544118 systemd[1]: Reached target sshd-keygen.target.
Dec  3 08:17:30 np0005544118 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec  3 08:17:30 np0005544118 systemd[1]: Reached target User and Group Name Lookups.
Dec  3 08:17:30 np0005544118 systemd[1]: Starting User Login Management...
Dec  3 08:17:30 np0005544118 systemd[1]: Started NTP client/server.
Dec  3 08:17:30 np0005544118 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec  3 08:17:30 np0005544118 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec  3 08:17:30 np0005544118 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec  3 08:17:30 np0005544118 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec  3 08:17:30 np0005544118 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  3 08:17:30 np0005544118 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  3 08:17:30 np0005544118 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec  3 08:17:30 np0005544118 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec  3 08:17:30 np0005544118 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec  3 08:17:30 np0005544118 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec  3 08:17:30 np0005544118 kernel: kvm_amd: TSC scaling supported
Dec  3 08:17:30 np0005544118 kernel: kvm_amd: Nested Virtualization enabled
Dec  3 08:17:30 np0005544118 kernel: kvm_amd: Nested Paging enabled
Dec  3 08:17:30 np0005544118 kernel: kvm_amd: LBR virtualization supported
Dec  3 08:17:30 np0005544118 systemd-logind[795]: New seat seat0.
Dec  3 08:17:30 np0005544118 systemd[1]: Started User Login Management.
Dec  3 08:17:30 np0005544118 kernel: Console: switching to colour dummy device 80x25
Dec  3 08:17:30 np0005544118 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec  3 08:17:30 np0005544118 kernel: [drm] features: -context_init
Dec  3 08:17:30 np0005544118 kernel: [drm] number of scanouts: 1
Dec  3 08:17:30 np0005544118 kernel: [drm] number of cap sets: 0
Dec  3 08:17:30 np0005544118 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec  3 08:17:30 np0005544118 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec  3 08:17:30 np0005544118 kernel: Console: switching to colour frame buffer device 128x48
Dec  3 08:17:30 np0005544118 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec  3 08:17:30 np0005544118 iptables.init[783]: iptables: Applying firewall rules: [  OK  ]
Dec  3 08:17:30 np0005544118 systemd[1]: Finished IPv4 firewall with iptables.
Dec  3 08:17:30 np0005544118 cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Wed, 03 Dec 2025 13:17:30 +0000. Up 6.40 seconds.
Dec  3 08:17:30 np0005544118 systemd[1]: run-cloud\x2dinit-tmp-tmpaz49x07_.mount: Deactivated successfully.
Dec  3 08:17:31 np0005544118 systemd[1]: Starting Hostname Service...
Dec  3 08:17:31 np0005544118 systemd[1]: Started Hostname Service.
Dec  3 08:17:31 np0005544118 systemd-hostnamed[855]: Hostname set to <np0005544118.novalocal> (static)
Dec  3 08:17:31 np0005544118 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec  3 08:17:31 np0005544118 systemd[1]: Reached target Preparation for Network.
Dec  3 08:17:31 np0005544118 systemd[1]: Starting Network Manager...
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2646] NetworkManager (version 1.54.1-1.el9) is starting... (boot:0378e26f-1df1-4cca-950c-062911188078)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2653] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2738] manager[0x55a19a120080]: monitoring kernel firmware directory '/lib/firmware'.
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2787] hostname: hostname: using hostnamed
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2788] hostname: static hostname changed from (none) to "np0005544118.novalocal"
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2794] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2939] manager[0x55a19a120080]: rfkill: Wi-Fi hardware radio set enabled
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2940] manager[0x55a19a120080]: rfkill: WWAN hardware radio set enabled
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2984] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2985] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2985] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2986] manager: Networking is enabled by state file
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2988] settings: Loaded settings plugin: keyfile (internal)
Dec  3 08:17:31 np0005544118 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.2997] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3017] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3028] dhcp: init: Using DHCP client 'internal'
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3030] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3043] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3051] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3058] device (lo): Activation: starting connection 'lo' (392164e0-7fb3-4d91-9407-dec18de6b483)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3066] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3069] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3098] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3103] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3106] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3108] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3111] device (eth0): carrier: link connected
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3115] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3121] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3127] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3132] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3132] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3134] manager: NetworkManager state is now CONNECTING
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3135] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3140] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3143] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:17:31 np0005544118 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  3 08:17:31 np0005544118 systemd[1]: Started Network Manager.
Dec  3 08:17:31 np0005544118 systemd[1]: Reached target Network.
Dec  3 08:17:31 np0005544118 systemd[1]: Starting Network Manager Wait Online...
Dec  3 08:17:31 np0005544118 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec  3 08:17:31 np0005544118 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  3 08:17:31 np0005544118 systemd[1]: Started GSSAPI Proxy Daemon.
Dec  3 08:17:31 np0005544118 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec  3 08:17:31 np0005544118 systemd[1]: Reached target NFS client services.
Dec  3 08:17:31 np0005544118 systemd[1]: Reached target Preparation for Remote File Systems.
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3450] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3456] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  3 08:17:31 np0005544118 NetworkManager[859]: <info>  [1764767851.3465] device (lo): Activation: successful, device activated.
Dec  3 08:17:31 np0005544118 systemd[1]: Reached target Remote File Systems.
Dec  3 08:17:31 np0005544118 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2610] dhcp4 (eth0): state changed new lease, address=38.102.83.106
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2619] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2640] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2674] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2675] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2678] manager: NetworkManager state is now CONNECTED_SITE
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2680] device (eth0): Activation: successful, device activated.
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2684] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  3 08:17:33 np0005544118 NetworkManager[859]: <info>  [1764767853.2685] manager: startup complete
Dec  3 08:17:33 np0005544118 systemd[1]: Finished Network Manager Wait Online.
Dec  3 08:17:33 np0005544118 systemd[1]: Starting Cloud-init: Network Stage...
Dec  3 08:17:33 np0005544118 cloud-init[921]: Cloud-init v. 24.4-7.el9 running 'init' at Wed, 03 Dec 2025 13:17:33 +0000. Up 9.23 seconds.
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |  eth0  | True |        38.102.83.106         | 255.255.255.0 | global | fa:16:3e:34:f1:b0 |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |  eth0  | True | fe80::f816:3eff:fe34:f1b0/64 |       .       |  link  | fa:16:3e:34:f1:b0 |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Dec  3 08:17:33 np0005544118 cloud-init[921]: ci-info: +-------+-------------+---------+-----------+-------+
Dec  3 08:17:34 np0005544118 cloud-init[921]: Generating public/private rsa key pair.
Dec  3 08:17:34 np0005544118 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec  3 08:17:34 np0005544118 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec  3 08:17:34 np0005544118 cloud-init[921]: The key fingerprint is:
Dec  3 08:17:34 np0005544118 cloud-init[921]: SHA256:7zvgOcHLN6MFaV57LAvcJLK3g0YbXu0yX7WT1favpnM root@np0005544118.novalocal
Dec  3 08:17:34 np0005544118 cloud-init[921]: The key's randomart image is:
Dec  3 08:17:34 np0005544118 cloud-init[921]: +---[RSA 3072]----+
Dec  3 08:17:34 np0005544118 cloud-init[921]: |                 |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |                 |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |                 |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |         .      .|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |      ..S.o   . +|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |      o*=B.o . =.|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |     o.B=** + + .|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |      =.X=*=. E..|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |     .  oO==.=...|
Dec  3 08:17:34 np0005544118 cloud-init[921]: +----[SHA256]-----+
Dec  3 08:17:34 np0005544118 cloud-init[921]: Generating public/private ecdsa key pair.
Dec  3 08:17:34 np0005544118 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec  3 08:17:34 np0005544118 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec  3 08:17:34 np0005544118 cloud-init[921]: The key fingerprint is:
Dec  3 08:17:34 np0005544118 cloud-init[921]: SHA256:sHcg6FT6Bm5KxS+BR9sRFeHFet05bPrT8DvkXG7msRE root@np0005544118.novalocal
Dec  3 08:17:34 np0005544118 cloud-init[921]: The key's randomart image is:
Dec  3 08:17:34 np0005544118 cloud-init[921]: +---[ECDSA 256]---+
Dec  3 08:17:34 np0005544118 cloud-init[921]: |    . +o++.      |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |   + * o ..      |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |  . @ + o. . o . |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |   * = +... . *  |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |  . = = S..  o E |
Dec  3 08:17:34 np0005544118 cloud-init[921]: | . o o . .  . ..o|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |  .          .+B.|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |              o+X|
Dec  3 08:17:34 np0005544118 cloud-init[921]: |               B+|
Dec  3 08:17:34 np0005544118 cloud-init[921]: +----[SHA256]-----+
Dec  3 08:17:34 np0005544118 cloud-init[921]: Generating public/private ed25519 key pair.
Dec  3 08:17:34 np0005544118 cloud-init[921]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec  3 08:17:34 np0005544118 cloud-init[921]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec  3 08:17:34 np0005544118 cloud-init[921]: The key fingerprint is:
Dec  3 08:17:34 np0005544118 cloud-init[921]: SHA256:BWBO9rt3EAhorWCJrY+gyUYrsnl8U8NqzZThlZ3mTps root@np0005544118.novalocal
Dec  3 08:17:34 np0005544118 cloud-init[921]: The key's randomart image is:
Dec  3 08:17:34 np0005544118 cloud-init[921]: +--[ED25519 256]--+
Dec  3 08:17:34 np0005544118 cloud-init[921]: | o . o*..        |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |. = o=.o o       |
Dec  3 08:17:34 np0005544118 cloud-init[921]: | o o .. ooo.     |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |o.  . . oo+.     |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |=+.  o +So.      |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |*+.   B  .o.     |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |++   * ..o.o.    |
Dec  3 08:17:34 np0005544118 cloud-init[921]: |o o = o  .E.     |
Dec  3 08:17:34 np0005544118 cloud-init[921]: | . o .           |
Dec  3 08:17:34 np0005544118 cloud-init[921]: +----[SHA256]-----+
Dec  3 08:17:34 np0005544118 systemd[1]: Finished Cloud-init: Network Stage.
Dec  3 08:17:34 np0005544118 systemd[1]: Reached target Cloud-config availability.
Dec  3 08:17:34 np0005544118 systemd[1]: Reached target Network is Online.
Dec  3 08:17:34 np0005544118 systemd[1]: Starting Cloud-init: Config Stage...
Dec  3 08:17:34 np0005544118 systemd[1]: Starting Crash recovery kernel arming...
Dec  3 08:17:34 np0005544118 systemd[1]: Starting Notify NFS peers of a restart...
Dec  3 08:17:34 np0005544118 systemd[1]: Starting System Logging Service...
Dec  3 08:17:34 np0005544118 systemd[1]: Starting OpenSSH server daemon...
Dec  3 08:17:34 np0005544118 sm-notify[1004]: Version 2.5.4 starting
Dec  3 08:17:34 np0005544118 systemd[1]: Starting Permit User Sessions...
Dec  3 08:17:34 np0005544118 systemd[1]: Started Notify NFS peers of a restart.
Dec  3 08:17:34 np0005544118 systemd[1]: Started OpenSSH server daemon.
Dec  3 08:17:34 np0005544118 systemd[1]: Finished Permit User Sessions.
Dec  3 08:17:34 np0005544118 systemd[1]: Started Command Scheduler.
Dec  3 08:17:34 np0005544118 systemd[1]: Started Getty on tty1.
Dec  3 08:17:34 np0005544118 rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Dec  3 08:17:34 np0005544118 rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec  3 08:17:34 np0005544118 systemd[1]: Started Serial Getty on ttyS0.
Dec  3 08:17:34 np0005544118 systemd[1]: Reached target Login Prompts.
Dec  3 08:17:34 np0005544118 systemd[1]: Started System Logging Service.
Dec  3 08:17:34 np0005544118 systemd[1]: Reached target Multi-User System.
Dec  3 08:17:34 np0005544118 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec  3 08:17:34 np0005544118 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec  3 08:17:34 np0005544118 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec  3 08:17:34 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 08:17:35 np0005544118 kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Dec  3 08:17:35 np0005544118 kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec  3 08:17:35 np0005544118 cloud-init[1162]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Wed, 03 Dec 2025 13:17:35 +0000. Up 10.79 seconds.
Dec  3 08:17:35 np0005544118 systemd[1]: Finished Cloud-init: Config Stage.
Dec  3 08:17:35 np0005544118 systemd[1]: Starting Cloud-init: Final Stage...
Dec  3 08:17:35 np0005544118 dracut[1283]: dracut-057-102.git20250818.el9
Dec  3 08:17:35 np0005544118 dracut[1285]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec  3 08:17:35 np0005544118 cloud-init[1310]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Wed, 03 Dec 2025 13:17:35 +0000. Up 11.19 seconds.
Dec  3 08:17:35 np0005544118 cloud-init[1355]: #############################################################
Dec  3 08:17:35 np0005544118 cloud-init[1356]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec  3 08:17:35 np0005544118 cloud-init[1361]: 256 SHA256:sHcg6FT6Bm5KxS+BR9sRFeHFet05bPrT8DvkXG7msRE root@np0005544118.novalocal (ECDSA)
Dec  3 08:17:35 np0005544118 cloud-init[1363]: 256 SHA256:BWBO9rt3EAhorWCJrY+gyUYrsnl8U8NqzZThlZ3mTps root@np0005544118.novalocal (ED25519)
Dec  3 08:17:35 np0005544118 cloud-init[1368]: 3072 SHA256:7zvgOcHLN6MFaV57LAvcJLK3g0YbXu0yX7WT1favpnM root@np0005544118.novalocal (RSA)
Dec  3 08:17:35 np0005544118 cloud-init[1369]: -----END SSH HOST KEY FINGERPRINTS-----
Dec  3 08:17:35 np0005544118 cloud-init[1370]: #############################################################
Dec  3 08:17:36 np0005544118 cloud-init[1310]: Cloud-init v. 24.4-7.el9 finished at Wed, 03 Dec 2025 13:17:36 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.66 seconds
Dec  3 08:17:36 np0005544118 systemd[1]: Finished Cloud-init: Final Stage.
Dec  3 08:17:36 np0005544118 systemd[1]: Reached target Cloud-init target.
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  3 08:17:36 np0005544118 dracut[1285]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: memstrack is not available
Dec  3 08:17:37 np0005544118 dracut[1285]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec  3 08:17:37 np0005544118 dracut[1285]: memstrack is not available
Dec  3 08:17:37 np0005544118 dracut[1285]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec  3 08:17:37 np0005544118 dracut[1285]: *** Including module: systemd ***
Dec  3 08:17:37 np0005544118 dracut[1285]: *** Including module: fips ***
Dec  3 08:17:38 np0005544118 dracut[1285]: *** Including module: systemd-initrd ***
Dec  3 08:17:38 np0005544118 dracut[1285]: *** Including module: i18n ***
Dec  3 08:17:38 np0005544118 dracut[1285]: *** Including module: drm ***
Dec  3 08:17:38 np0005544118 chronyd[792]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Dec  3 08:17:38 np0005544118 chronyd[792]: System clock TAI offset set to 37 seconds
Dec  3 08:17:38 np0005544118 dracut[1285]: *** Including module: prefixdevname ***
Dec  3 08:17:38 np0005544118 dracut[1285]: *** Including module: kernel-modules ***
Dec  3 08:17:39 np0005544118 kernel: block vda: the capability attribute has been deprecated.
Dec  3 08:17:39 np0005544118 dracut[1285]: *** Including module: kernel-modules-extra ***
Dec  3 08:17:39 np0005544118 dracut[1285]: *** Including module: qemu ***
Dec  3 08:17:39 np0005544118 dracut[1285]: *** Including module: fstab-sys ***
Dec  3 08:17:39 np0005544118 dracut[1285]: *** Including module: rootfs-block ***
Dec  3 08:17:39 np0005544118 dracut[1285]: *** Including module: terminfo ***
Dec  3 08:17:39 np0005544118 dracut[1285]: *** Including module: udev-rules ***
Dec  3 08:17:40 np0005544118 irqbalance[786]: Cannot change IRQ 25 affinity: Operation not permitted
Dec  3 08:17:40 np0005544118 irqbalance[786]: IRQ 25 affinity is now unmanaged
Dec  3 08:17:40 np0005544118 irqbalance[786]: Cannot change IRQ 31 affinity: Operation not permitted
Dec  3 08:17:40 np0005544118 irqbalance[786]: IRQ 31 affinity is now unmanaged
Dec  3 08:17:40 np0005544118 irqbalance[786]: Cannot change IRQ 28 affinity: Operation not permitted
Dec  3 08:17:40 np0005544118 irqbalance[786]: IRQ 28 affinity is now unmanaged
Dec  3 08:17:40 np0005544118 irqbalance[786]: Cannot change IRQ 32 affinity: Operation not permitted
Dec  3 08:17:40 np0005544118 irqbalance[786]: IRQ 32 affinity is now unmanaged
Dec  3 08:17:40 np0005544118 irqbalance[786]: Cannot change IRQ 30 affinity: Operation not permitted
Dec  3 08:17:40 np0005544118 irqbalance[786]: IRQ 30 affinity is now unmanaged
Dec  3 08:17:40 np0005544118 irqbalance[786]: Cannot change IRQ 29 affinity: Operation not permitted
Dec  3 08:17:40 np0005544118 irqbalance[786]: IRQ 29 affinity is now unmanaged
Dec  3 08:17:40 np0005544118 dracut[1285]: Skipping udev rule: 91-permissions.rules
Dec  3 08:17:40 np0005544118 dracut[1285]: Skipping udev rule: 80-drivers-modprobe.rules
Dec  3 08:17:40 np0005544118 dracut[1285]: *** Including module: virtiofs ***
Dec  3 08:17:40 np0005544118 dracut[1285]: *** Including module: dracut-systemd ***
Dec  3 08:17:40 np0005544118 dracut[1285]: *** Including module: usrmount ***
Dec  3 08:17:40 np0005544118 dracut[1285]: *** Including module: base ***
Dec  3 08:17:41 np0005544118 dracut[1285]: *** Including module: fs-lib ***
Dec  3 08:17:41 np0005544118 dracut[1285]: *** Including module: kdumpbase ***
Dec  3 08:17:41 np0005544118 dracut[1285]: *** Including module: microcode_ctl-fw_dir_override ***
Dec  3 08:17:41 np0005544118 dracut[1285]:  microcode_ctl module: mangling fw_dir
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec  3 08:17:41 np0005544118 dracut[1285]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec  3 08:17:42 np0005544118 dracut[1285]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec  3 08:17:42 np0005544118 dracut[1285]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec  3 08:17:42 np0005544118 dracut[1285]: *** Including module: openssl ***
Dec  3 08:17:42 np0005544118 dracut[1285]: *** Including module: shutdown ***
Dec  3 08:17:42 np0005544118 dracut[1285]: *** Including module: squash ***
Dec  3 08:17:42 np0005544118 dracut[1285]: *** Including modules done ***
Dec  3 08:17:42 np0005544118 dracut[1285]: *** Installing kernel module dependencies ***
Dec  3 08:17:43 np0005544118 dracut[1285]: *** Installing kernel module dependencies done ***
Dec  3 08:17:43 np0005544118 dracut[1285]: *** Resolving executable dependencies ***
Dec  3 08:17:43 np0005544118 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  3 08:17:44 np0005544118 dracut[1285]: *** Resolving executable dependencies done ***
Dec  3 08:17:44 np0005544118 dracut[1285]: *** Generating early-microcode cpio image ***
Dec  3 08:17:44 np0005544118 dracut[1285]: *** Store current command line parameters ***
Dec  3 08:17:44 np0005544118 dracut[1285]: Stored kernel commandline:
Dec  3 08:17:44 np0005544118 dracut[1285]: No dracut internal kernel commandline stored in the initramfs
Dec  3 08:17:45 np0005544118 dracut[1285]: *** Install squash loader ***
Dec  3 08:17:45 np0005544118 dracut[1285]: *** Squashing the files inside the initramfs ***
Dec  3 08:17:47 np0005544118 dracut[1285]: *** Squashing the files inside the initramfs done ***
Dec  3 08:17:47 np0005544118 dracut[1285]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec  3 08:17:47 np0005544118 dracut[1285]: *** Hardlinking files ***
Dec  3 08:17:47 np0005544118 dracut[1285]: *** Hardlinking files done ***
Dec  3 08:17:47 np0005544118 dracut[1285]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec  3 08:17:47 np0005544118 kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Dec  3 08:17:47 np0005544118 kdumpctl[1018]: kdump: Starting kdump: [OK]
Dec  3 08:17:48 np0005544118 systemd[1]: Finished Crash recovery kernel arming.
Dec  3 08:17:48 np0005544118 systemd[1]: Startup finished in 1.621s (kernel) + 2.677s (initrd) + 19.370s (userspace) = 23.669s.
Dec  3 08:18:01 np0005544118 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  3 08:18:06 np0005544118 systemd[1]: Created slice User Slice of UID 1000.
Dec  3 08:18:06 np0005544118 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec  3 08:18:06 np0005544118 systemd-logind[795]: New session 1 of user zuul.
Dec  3 08:18:06 np0005544118 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec  3 08:18:06 np0005544118 systemd[1]: Starting User Manager for UID 1000...
Dec  3 08:18:06 np0005544118 systemd[4301]: Queued start job for default target Main User Target.
Dec  3 08:18:06 np0005544118 systemd[4301]: Created slice User Application Slice.
Dec  3 08:18:06 np0005544118 systemd[4301]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  3 08:18:06 np0005544118 systemd[4301]: Started Daily Cleanup of User's Temporary Directories.
Dec  3 08:18:06 np0005544118 systemd[4301]: Reached target Paths.
Dec  3 08:18:06 np0005544118 systemd[4301]: Reached target Timers.
Dec  3 08:18:06 np0005544118 systemd[4301]: Starting D-Bus User Message Bus Socket...
Dec  3 08:18:06 np0005544118 systemd[4301]: Starting Create User's Volatile Files and Directories...
Dec  3 08:18:06 np0005544118 systemd[4301]: Finished Create User's Volatile Files and Directories.
Dec  3 08:18:06 np0005544118 systemd[4301]: Listening on D-Bus User Message Bus Socket.
Dec  3 08:18:06 np0005544118 systemd[4301]: Reached target Sockets.
Dec  3 08:18:06 np0005544118 systemd[4301]: Reached target Basic System.
Dec  3 08:18:06 np0005544118 systemd[4301]: Reached target Main User Target.
Dec  3 08:18:06 np0005544118 systemd[4301]: Startup finished in 116ms.
Dec  3 08:18:06 np0005544118 systemd[1]: Started User Manager for UID 1000.
Dec  3 08:18:06 np0005544118 systemd[1]: Started Session 1 of User zuul.
Dec  3 08:18:07 np0005544118 python3[4383]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:18:10 np0005544118 python3[4411]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:18:16 np0005544118 python3[4469]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:18:17 np0005544118 python3[4509]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec  3 08:18:19 np0005544118 python3[4535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMhAgWNF1dbliotqRDZdMIJK6u58F7HtC31ed3fv09LSV+dQXYPaoUU2cNSoFK2jdHLs8QnwZB2ju1mru3gbJfoo5jdGLfWXxq2OaJ6tq+y7Jz5ieXqNvA9ShTWwh9hfOrUsLQCiay96qOlUoCMrO0qcIM6PUAFWNxepGoWbOs5MjOFQJ4oRUZRGUPRzsD8MCjfVNVsC09I/GlU0cb69yngCXo0XAa729C8UYjoDGCb/72US+g9e5TPbxxZvyRenNZzkTRbP3nAl2golG3qBH9yCf6hdPwHTZxLUExOgIvVTBXR2Kc5MyDsZZvUcEpt2qffLqlnleAzx9bvTgoXA7LqKVX5MmWZNMaw9N1SfxMTWPinnn+fzGleuGYZViPKlkQGmChtCkQzQBq/zqmegqSBVyPMIa+ZrMLFajRizIa51p2y3L+/B9tyfYE70AV12X3gZfOt0o3i4bQuz/pRB0wwKekFG27NbgS7Lr4X7XAQfs8dedeG26aDCvtbQmdcqM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:19 np0005544118 python3[4559]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:20 np0005544118 python3[4658]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:18:20 np0005544118 python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764767900.1406505-230-9449031319636/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=8527df38bf8e4529ab277f119e40d44a_id_rsa follow=False checksum=dfb35e22d3269ee7046508101fdb7318907c8c34 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:21 np0005544118 python3[4852]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:18:21 np0005544118 python3[4923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764767901.066299-274-71223985699430/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=8527df38bf8e4529ab277f119e40d44a_id_rsa.pub follow=False checksum=6823d48d2b206cbca9cb1fdc0747d461c96558ce backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:23 np0005544118 python3[4971]: ansible-ping Invoked with data=pong
Dec  3 08:18:24 np0005544118 python3[4995]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:18:26 np0005544118 python3[5053]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec  3 08:18:27 np0005544118 python3[5085]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:27 np0005544118 python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:28 np0005544118 python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:28 np0005544118 python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:28 np0005544118 python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:29 np0005544118 python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:30 np0005544118 python3[5231]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:31 np0005544118 python3[5309]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:18:32 np0005544118 python3[5382]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764767910.980934-27-36468261641681/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:32 np0005544118 python3[5430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:32 np0005544118 python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:33 np0005544118 python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:33 np0005544118 python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:33 np0005544118 python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:34 np0005544118 python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:34 np0005544118 python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:34 np0005544118 python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:34 np0005544118 python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:35 np0005544118 python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:35 np0005544118 python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:35 np0005544118 python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:35 np0005544118 python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:36 np0005544118 python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:36 np0005544118 python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:36 np0005544118 python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:37 np0005544118 python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:37 np0005544118 python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:37 np0005544118 python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:37 np0005544118 python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:38 np0005544118 python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:38 np0005544118 python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:38 np0005544118 python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:39 np0005544118 python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:39 np0005544118 python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:39 np0005544118 python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:18:42 np0005544118 python3[6056]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  3 08:18:42 np0005544118 systemd[1]: Starting Time & Date Service...
Dec  3 08:18:42 np0005544118 systemd[1]: Started Time & Date Service.
Dec  3 08:18:42 np0005544118 systemd-timedated[6058]: Changed time zone to 'UTC' (UTC).
Dec  3 08:18:43 np0005544118 python3[6087]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:43 np0005544118 python3[6163]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:18:43 np0005544118 python3[6234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764767923.2393448-204-137810279172973/source _original_basename=tmp6vw_xcq1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:44 np0005544118 python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:18:44 np0005544118 python3[6405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764767924.2381833-244-112103822835816/source _original_basename=tmpsud7oxqi follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:45 np0005544118 python3[6507]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:18:46 np0005544118 python3[6580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764767925.5604506-307-6614440431146/source _original_basename=tmps90gmhkg follow=False checksum=c8c0add412d571e63862b10c4bf0a26f0fcae547 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:46 np0005544118 python3[6628]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:18:47 np0005544118 python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:18:47 np0005544118 python3[6734]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:18:48 np0005544118 python3[6807]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764767927.3200145-363-15142099395369/source _original_basename=tmpz25xw4j6 follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:18:48 np0005544118 python3[6858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-d6c3-4ac6-00000000001e-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:18:49 np0005544118 python3[6886]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d6c3-4ac6-00000000001f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec  3 08:18:50 np0005544118 python3[6915]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:19:11 np0005544118 python3[6941]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:19:12 np0005544118 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  3 08:20:11 np0005544118 systemd-logind[795]: Session 1 logged out. Waiting for processes to exit.
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec  3 08:20:14 np0005544118 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec  3 08:20:14 np0005544118 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.2857] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  3 08:20:14 np0005544118 systemd-udevd[6946]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3090] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3118] settings: (eth1): created default wired connection 'Wired connection 1'
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3121] device (eth1): carrier: link connected
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3123] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3128] policy: auto-activating connection 'Wired connection 1' (a63b2672-8230-3037-8b10-0c5a89d5ff35)
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3132] device (eth1): Activation: starting connection 'Wired connection 1' (a63b2672-8230-3037-8b10-0c5a89d5ff35)
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3133] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3135] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3139] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:20:14 np0005544118 NetworkManager[859]: <info>  [1764768014.3143] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:20:14 np0005544118 systemd[4301]: Starting Mark boot as successful...
Dec  3 08:20:14 np0005544118 systemd[4301]: Finished Mark boot as successful.
Dec  3 08:20:15 np0005544118 systemd-logind[795]: New session 3 of user zuul.
Dec  3 08:20:15 np0005544118 systemd[1]: Started Session 3 of User zuul.
Dec  3 08:20:15 np0005544118 python3[6978]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-3370-d836-000000000173-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:20:22 np0005544118 python3[7058]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:20:23 np0005544118 python3[7131]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764768022.4415565-154-50473136210077/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=f79d0eab6fe03873e368b1e6e5c9b79285061cd6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:20:23 np0005544118 python3[7181]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:20:23 np0005544118 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  3 08:20:23 np0005544118 systemd[1]: Stopped Network Manager Wait Online.
Dec  3 08:20:23 np0005544118 systemd[1]: Stopping Network Manager Wait Online...
Dec  3 08:20:23 np0005544118 systemd[1]: Stopping Network Manager...
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.7559] caught SIGTERM, shutting down normally.
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.7570] dhcp4 (eth0): canceled DHCP transaction
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.7570] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.7570] dhcp4 (eth0): state changed no lease
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.7572] manager: NetworkManager state is now CONNECTING
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.7676] dhcp4 (eth1): canceled DHCP transaction
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.7676] dhcp4 (eth1): state changed no lease
Dec  3 08:20:23 np0005544118 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  3 08:20:23 np0005544118 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  3 08:20:23 np0005544118 NetworkManager[859]: <info>  [1764768023.8297] exiting (success)
Dec  3 08:20:23 np0005544118 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  3 08:20:23 np0005544118 systemd[1]: Stopped Network Manager.
Dec  3 08:20:23 np0005544118 systemd[1]: NetworkManager.service: Consumed 1.118s CPU time, 9.9M memory peak.
Dec  3 08:20:23 np0005544118 systemd[1]: Starting Network Manager...
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.8841] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:0378e26f-1df1-4cca-950c-062911188078)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.8843] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.8901] manager[0x560468de8070]: monitoring kernel firmware directory '/lib/firmware'.
Dec  3 08:20:23 np0005544118 systemd[1]: Starting Hostname Service...
Dec  3 08:20:23 np0005544118 systemd[1]: Started Hostname Service.
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9693] hostname: hostname: using hostnamed
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9694] hostname: static hostname changed from (none) to "np0005544118.novalocal"
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9698] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9704] manager[0x560468de8070]: rfkill: Wi-Fi hardware radio set enabled
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9704] manager[0x560468de8070]: rfkill: WWAN hardware radio set enabled
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9732] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9732] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9732] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9733] manager: Networking is enabled by state file
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9735] settings: Loaded settings plugin: keyfile (internal)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9738] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9761] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9769] dhcp: init: Using DHCP client 'internal'
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9771] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9775] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9786] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9798] device (lo): Activation: starting connection 'lo' (392164e0-7fb3-4d91-9407-dec18de6b483)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9806] device (eth0): carrier: link connected
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9810] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9816] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9816] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9823] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9830] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9837] device (eth1): carrier: link connected
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9840] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9845] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (a63b2672-8230-3037-8b10-0c5a89d5ff35) (indicated)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9845] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9850] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9858] device (eth1): Activation: starting connection 'Wired connection 1' (a63b2672-8230-3037-8b10-0c5a89d5ff35)
Dec  3 08:20:23 np0005544118 systemd[1]: Started Network Manager.
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9867] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9880] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9883] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9884] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9886] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9889] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9891] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9893] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9894] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9900] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9902] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9909] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9911] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9936] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9937] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9941] device (lo): Activation: successful, device activated.
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9947] dhcp4 (eth0): state changed new lease, address=38.102.83.106
Dec  3 08:20:23 np0005544118 NetworkManager[7198]: <info>  [1764768023.9951] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  3 08:20:24 np0005544118 systemd[1]: Starting Network Manager Wait Online...
Dec  3 08:20:24 np0005544118 NetworkManager[7198]: <info>  [1764768024.0384] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  3 08:20:24 np0005544118 NetworkManager[7198]: <info>  [1764768024.0436] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  3 08:20:24 np0005544118 NetworkManager[7198]: <info>  [1764768024.0437] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  3 08:20:24 np0005544118 NetworkManager[7198]: <info>  [1764768024.0442] manager: NetworkManager state is now CONNECTED_SITE
Dec  3 08:20:24 np0005544118 NetworkManager[7198]: <info>  [1764768024.0447] device (eth0): Activation: successful, device activated.
Dec  3 08:20:24 np0005544118 NetworkManager[7198]: <info>  [1764768024.0452] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  3 08:20:24 np0005544118 python3[7265]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-3370-d836-0000000000bd-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:20:34 np0005544118 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  3 08:20:53 np0005544118 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3391] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  3 08:21:09 np0005544118 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  3 08:21:09 np0005544118 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3669] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3672] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3679] device (eth1): Activation: successful, device activated.
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3685] manager: startup complete
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3686] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <warn>  [1764768069.3691] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3698] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec  3 08:21:09 np0005544118 systemd[1]: Finished Network Manager Wait Online.
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3806] dhcp4 (eth1): canceled DHCP transaction
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3807] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3807] dhcp4 (eth1): state changed no lease
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3824] policy: auto-activating connection 'ci-private-network' (7005831a-9eeb-556d-afba-a4917a46e274)
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3828] device (eth1): Activation: starting connection 'ci-private-network' (7005831a-9eeb-556d-afba-a4917a46e274)
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3830] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3832] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3839] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3848] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3948] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3951] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:21:09 np0005544118 NetworkManager[7198]: <info>  [1764768069.3960] device (eth1): Activation: successful, device activated.
Dec  3 08:21:19 np0005544118 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  3 08:21:24 np0005544118 systemd[1]: session-3.scope: Deactivated successfully.
Dec  3 08:21:24 np0005544118 systemd[1]: session-3.scope: Consumed 1.624s CPU time.
Dec  3 08:21:24 np0005544118 systemd-logind[795]: Session 3 logged out. Waiting for processes to exit.
Dec  3 08:21:24 np0005544118 systemd-logind[795]: Removed session 3.
Dec  3 08:21:33 np0005544118 systemd-logind[795]: New session 4 of user zuul.
Dec  3 08:21:33 np0005544118 systemd[1]: Started Session 4 of User zuul.
Dec  3 08:21:34 np0005544118 python3[7376]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:21:34 np0005544118 python3[7449]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764768093.9731748-312-230815884821047/source _original_basename=tmpip5y4ykh follow=False checksum=4ce0d5b3436e5347475ee884781df60f0b225451 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:21:36 np0005544118 systemd[1]: session-4.scope: Deactivated successfully.
Dec  3 08:21:36 np0005544118 systemd-logind[795]: Session 4 logged out. Waiting for processes to exit.
Dec  3 08:21:36 np0005544118 systemd-logind[795]: Removed session 4.
Dec  3 08:23:42 np0005544118 systemd[4301]: Created slice User Background Tasks Slice.
Dec  3 08:23:42 np0005544118 systemd[4301]: Starting Cleanup of User's Temporary Files and Directories...
Dec  3 08:23:42 np0005544118 systemd[4301]: Finished Cleanup of User's Temporary Files and Directories.
Dec  3 08:27:20 np0005544118 systemd-logind[795]: New session 5 of user zuul.
Dec  3 08:27:20 np0005544118 systemd[1]: Started Session 5 of User zuul.
Dec  3 08:27:20 np0005544118 python3[7529]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-2170-170d-000000001cd7-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:27:21 np0005544118 python3[7558]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:27:21 np0005544118 python3[7584]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:27:21 np0005544118 python3[7610]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:27:22 np0005544118 python3[7636]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:27:22 np0005544118 python3[7662]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:27:23 np0005544118 python3[7740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:27:23 np0005544118 python3[7813]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764768443.1213722-494-70431029206901/source _original_basename=tmppsr5al_t follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:27:24 np0005544118 python3[7863]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 08:27:24 np0005544118 systemd[1]: Reloading.
Dec  3 08:27:24 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:27:26 np0005544118 python3[7919]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec  3 08:27:27 np0005544118 python3[7945]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:27:27 np0005544118 python3[7973]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:27:27 np0005544118 python3[8001]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:27:27 np0005544118 python3[8029]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:27:28 np0005544118 python3[8056]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-2170-170d-000000001cde-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:27:28 np0005544118 python3[8086]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec  3 08:27:31 np0005544118 systemd-logind[795]: Session 5 logged out. Waiting for processes to exit.
Dec  3 08:27:31 np0005544118 systemd[1]: session-5.scope: Deactivated successfully.
Dec  3 08:27:31 np0005544118 systemd[1]: session-5.scope: Consumed 3.972s CPU time.
Dec  3 08:27:31 np0005544118 systemd-logind[795]: Removed session 5.
Dec  3 08:27:33 np0005544118 systemd-logind[795]: New session 6 of user zuul.
Dec  3 08:27:33 np0005544118 systemd[1]: Started Session 6 of User zuul.
Dec  3 08:27:33 np0005544118 python3[8119]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec  3 08:27:51 np0005544118 kernel: SELinux:  Converting 385 SID table entries...
Dec  3 08:27:51 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:27:51 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:27:51 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:27:51 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:27:51 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:27:51 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:27:51 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:28:02 np0005544118 kernel: SELinux:  Converting 385 SID table entries...
Dec  3 08:28:02 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:28:02 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:28:02 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:28:02 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:28:02 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:28:02 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:28:02 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:28:14 np0005544118 kernel: SELinux:  Converting 385 SID table entries...
Dec  3 08:28:14 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:28:14 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:28:14 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:28:14 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:28:14 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:28:14 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:28:14 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:28:16 np0005544118 setsebool[8185]: The virt_use_nfs policy boolean was changed to 1 by root
Dec  3 08:28:16 np0005544118 setsebool[8185]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec  3 08:28:30 np0005544118 kernel: SELinux:  Converting 388 SID table entries...
Dec  3 08:28:30 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:28:30 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:28:30 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:28:30 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:28:30 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:28:30 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:28:30 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:28:48 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  3 08:28:48 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 08:28:48 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 08:28:48 np0005544118 systemd[1]: Reloading.
Dec  3 08:28:48 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:28:49 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 08:28:51 np0005544118 python3[10293]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d96a-c90d-00000000000b-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:28:52 np0005544118 kernel: evm: overlay not supported
Dec  3 08:28:52 np0005544118 systemd[4301]: Starting D-Bus User Message Bus...
Dec  3 08:28:52 np0005544118 dbus-broker-launch[11829]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec  3 08:28:52 np0005544118 dbus-broker-launch[11829]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec  3 08:28:52 np0005544118 systemd[4301]: Started D-Bus User Message Bus.
Dec  3 08:28:52 np0005544118 dbus-broker-lau[11829]: Ready
Dec  3 08:28:52 np0005544118 systemd[4301]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec  3 08:28:52 np0005544118 systemd[4301]: Created slice Slice /user.
Dec  3 08:28:52 np0005544118 systemd[4301]: podman-11488.scope: unit configures an IP firewall, but not running as root.
Dec  3 08:28:52 np0005544118 systemd[4301]: (This warning is only shown for the first unit using IP firewalling.)
Dec  3 08:28:52 np0005544118 systemd[4301]: Started podman-11488.scope.
Dec  3 08:28:52 np0005544118 systemd[4301]: Started podman-pause-79ef7b01.scope.
Dec  3 08:28:53 np0005544118 python3[12545]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.129.56.111:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.129.56.111:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:28:53 np0005544118 python3[12545]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec  3 08:28:53 np0005544118 systemd[1]: session-6.scope: Deactivated successfully.
Dec  3 08:28:53 np0005544118 systemd[1]: session-6.scope: Consumed 1min 10.248s CPU time.
Dec  3 08:28:53 np0005544118 systemd-logind[795]: Session 6 logged out. Waiting for processes to exit.
Dec  3 08:28:53 np0005544118 systemd-logind[795]: Removed session 6.
Dec  3 08:29:17 np0005544118 systemd-logind[795]: New session 7 of user zuul.
Dec  3 08:29:17 np0005544118 systemd[1]: Started Session 7 of User zuul.
Dec  3 08:29:18 np0005544118 python3[23886]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIRgjhTVYJS/+1Q0rzkH1/tS92cVKJ3cZj47pq9HOnPMfF+k2PEKAblNA3kAd+bs2YC4Ldt+BPAuwXKE57AxDJA= zuul@np0005544116.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:29:18 np0005544118 python3[24109]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIRgjhTVYJS/+1Q0rzkH1/tS92cVKJ3cZj47pq9HOnPMfF+k2PEKAblNA3kAd+bs2YC4Ldt+BPAuwXKE57AxDJA= zuul@np0005544116.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:29:19 np0005544118 python3[24528]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005544118.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec  3 08:29:20 np0005544118 python3[24885]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIRgjhTVYJS/+1Q0rzkH1/tS92cVKJ3cZj47pq9HOnPMfF+k2PEKAblNA3kAd+bs2YC4Ldt+BPAuwXKE57AxDJA= zuul@np0005544116.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec  3 08:29:20 np0005544118 python3[25194]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:29:21 np0005544118 python3[25510]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764768560.4455688-152-59959671802922/source _original_basename=tmpy3njk4mc follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:29:22 np0005544118 python3[25970]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Dec  3 08:29:22 np0005544118 systemd[1]: Starting Hostname Service...
Dec  3 08:29:22 np0005544118 systemd[1]: Started Hostname Service.
Dec  3 08:29:22 np0005544118 systemd-hostnamed[26103]: Changed pretty hostname to 'compute-1'
Dec  3 08:29:22 np0005544118 systemd-hostnamed[26103]: Hostname set to <compute-1> (static)
Dec  3 08:29:22 np0005544118 NetworkManager[7198]: <info>  [1764768562.4103] hostname: static hostname changed from "np0005544118.novalocal" to "compute-1"
Dec  3 08:29:22 np0005544118 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  3 08:29:22 np0005544118 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  3 08:29:23 np0005544118 systemd[1]: session-7.scope: Deactivated successfully.
Dec  3 08:29:23 np0005544118 systemd[1]: session-7.scope: Consumed 2.253s CPU time.
Dec  3 08:29:23 np0005544118 systemd-logind[795]: Session 7 logged out. Waiting for processes to exit.
Dec  3 08:29:23 np0005544118 systemd-logind[795]: Removed session 7.
Dec  3 08:29:32 np0005544118 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  3 08:29:33 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 08:29:33 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 08:29:33 np0005544118 systemd[1]: man-db-cache-update.service: Consumed 51.092s CPU time.
Dec  3 08:29:33 np0005544118 systemd[1]: run-r3e8b29581a754ba78aa4c0c6cb73e8fe.service: Deactivated successfully.
Dec  3 08:29:52 np0005544118 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  3 08:32:42 np0005544118 systemd[1]: Starting Cleanup of Temporary Directories...
Dec  3 08:32:42 np0005544118 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec  3 08:32:42 np0005544118 systemd[1]: Finished Cleanup of Temporary Directories.
Dec  3 08:32:42 np0005544118 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec  3 08:34:12 np0005544118 systemd-logind[795]: New session 8 of user zuul.
Dec  3 08:34:12 np0005544118 systemd[1]: Started Session 8 of User zuul.
Dec  3 08:34:13 np0005544118 python3[30032]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:34:14 np0005544118 python3[30148]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:34:15 np0005544118 python3[30221]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764768854.6041536-33761-36038605264643/source mode=0755 _original_basename=delorean.repo follow=False checksum=39c885eb875fd03e010d1b0454241c26b121dfb2 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:34:15 np0005544118 python3[30247]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:34:16 np0005544118 python3[30320]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764768854.6041536-33761-36038605264643/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:34:16 np0005544118 python3[30346]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:34:16 np0005544118 python3[30419]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764768854.6041536-33761-36038605264643/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:34:16 np0005544118 python3[30445]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:34:17 np0005544118 python3[30518]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764768854.6041536-33761-36038605264643/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:34:17 np0005544118 python3[30544]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:34:17 np0005544118 python3[30617]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764768854.6041536-33761-36038605264643/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:34:18 np0005544118 python3[30643]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:34:18 np0005544118 python3[30716]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764768854.6041536-33761-36038605264643/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:34:18 np0005544118 python3[30742]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec  3 08:34:19 np0005544118 python3[30815]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764768854.6041536-33761-36038605264643/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=6e18e2038d54303b4926db53c0b6cced515a9151 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:35:44 np0005544118 python3[30865]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:40:44 np0005544118 systemd[1]: session-8.scope: Deactivated successfully.
Dec  3 08:40:44 np0005544118 systemd[1]: session-8.scope: Consumed 5.300s CPU time.
Dec  3 08:40:44 np0005544118 systemd-logind[795]: Session 8 logged out. Waiting for processes to exit.
Dec  3 08:40:44 np0005544118 systemd-logind[795]: Removed session 8.
Dec  3 08:45:42 np0005544118 systemd[1]: Starting dnf makecache...
Dec  3 08:45:42 np0005544118 dnf[30901]: Failed determining last makecache time.
Dec  3 08:45:43 np0005544118 dnf[30901]: delorean-openstack-barbican-42b4c41831408a8e323 158 kB/s |  13 kB     00:00
Dec  3 08:45:43 np0005544118 dnf[30901]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 290 kB/s |  65 kB     00:00
Dec  3 08:45:44 np0005544118 dnf[30901]: delorean-openstack-cinder-1c00d6490d88e436f26ef 194 kB/s |  32 kB     00:00
Dec  3 08:45:44 np0005544118 dnf[30901]: delorean-python-stevedore-c4acc5639fd2329372142 861 kB/s | 131 kB     00:00
Dec  3 08:45:44 np0005544118 dnf[30901]: delorean-python-cloudkitty-tests-tempest-2c80f8 466 kB/s |  32 kB     00:00
Dec  3 08:45:44 np0005544118 dnf[30901]: delorean-os-net-config-d0cedbdb788d43e5c7551df5 8.9 MB/s | 349 kB     00:00
Dec  3 08:45:44 np0005544118 dnf[30901]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 597 kB/s |  42 kB     00:00
Dec  3 08:45:44 np0005544118 dnf[30901]: delorean-python-designate-tests-tempest-347fdbc  72 kB/s |  18 kB     00:00
Dec  3 08:45:45 np0005544118 dnf[30901]: delorean-openstack-glance-1fd12c29b339f30fe823e  69 kB/s |  18 kB     00:00
Dec  3 08:45:45 np0005544118 dnf[30901]: delorean-openstack-keystone-e4b40af0ae3698fbbbb  92 kB/s |  29 kB     00:00
Dec  3 08:45:45 np0005544118 dnf[30901]: delorean-openstack-manila-3c01b7181572c95dac462  45 kB/s |  25 kB     00:00
Dec  3 08:45:46 np0005544118 dnf[30901]: delorean-python-whitebox-neutron-tests-tempest- 705 kB/s | 154 kB     00:00
Dec  3 08:45:46 np0005544118 dnf[30901]: delorean-openstack-octavia-ba397f07a7331190208c  91 kB/s |  26 kB     00:00
Dec  3 08:45:47 np0005544118 dnf[30901]: delorean-openstack-watcher-c014f81a8647287f6dcc  19 kB/s |  16 kB     00:00
Dec  3 08:45:47 np0005544118 dnf[30901]: delorean-ansible-config_template-5ccaa22121a7ff  48 kB/s | 7.4 kB     00:00
Dec  3 08:45:47 np0005544118 dnf[30901]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 1.5 MB/s | 144 kB     00:00
Dec  3 08:45:47 np0005544118 dnf[30901]: delorean-openstack-swift-dc98a8463506ac520c469a  93 kB/s |  14 kB     00:00
Dec  3 08:45:47 np0005544118 dnf[30901]: delorean-python-tempestconf-8515371b7cceebd4282 465 kB/s |  53 kB     00:00
Dec  3 08:45:48 np0005544118 dnf[30901]: delorean-openstack-heat-ui-013accbfd179753bc3f0 686 kB/s |  96 kB     00:00
Dec  3 08:45:48 np0005544118 dnf[30901]: CentOS Stream 9 - BaseOS                         67 kB/s | 6.4 kB     00:00
Dec  3 08:45:48 np0005544118 dnf[30901]: CentOS Stream 9 - AppStream                      68 kB/s | 6.5 kB     00:00
Dec  3 08:45:48 np0005544118 dnf[30901]: CentOS Stream 9 - CRB                            73 kB/s | 6.3 kB     00:00
Dec  3 08:45:48 np0005544118 dnf[30901]: CentOS Stream 9 - Extras packages                29 kB/s | 8.3 kB     00:00
Dec  3 08:45:49 np0005544118 dnf[30901]: dlrn-antelope-testing                           1.6 MB/s | 1.1 MB     00:00
Dec  3 08:45:50 np0005544118 dnf[30901]: dlrn-antelope-build-deps                         14 MB/s | 461 kB     00:00
Dec  3 08:45:50 np0005544118 dnf[30901]: centos9-rabbitmq                                877 kB/s | 123 kB     00:00
Dec  3 08:45:50 np0005544118 dnf[30901]: centos9-storage                                  14 MB/s | 415 kB     00:00
Dec  3 08:45:50 np0005544118 dnf[30901]: centos9-opstools                                2.0 MB/s |  51 kB     00:00
Dec  3 08:45:50 np0005544118 dnf[30901]: NFV SIG OpenvSwitch                             3.8 MB/s | 456 kB     00:00
Dec  3 08:45:51 np0005544118 dnf[30901]: repo-setup-centos-appstream                      72 MB/s |  25 MB     00:00
Dec  3 08:45:59 np0005544118 dnf[30901]: repo-setup-centos-baseos                         30 MB/s | 8.8 MB     00:00
Dec  3 08:46:00 np0005544118 dnf[30901]: repo-setup-centos-highavailability               25 MB/s | 744 kB     00:00
Dec  3 08:46:01 np0005544118 dnf[30901]: repo-setup-centos-powertools                     78 MB/s | 7.3 MB     00:00
Dec  3 08:46:03 np0005544118 dnf[30901]: Extra Packages for Enterprise Linux 9 - x86_64   41 MB/s |  20 MB     00:00
Dec  3 08:46:20 np0005544118 irqbalance[786]: Cannot change IRQ 26 affinity: Operation not permitted
Dec  3 08:46:20 np0005544118 irqbalance[786]: IRQ 26 affinity is now unmanaged
Dec  3 08:46:22 np0005544118 dnf[30901]: Metadata cache created.
Dec  3 08:46:22 np0005544118 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec  3 08:46:22 np0005544118 systemd[1]: Finished dnf makecache.
Dec  3 08:46:22 np0005544118 systemd[1]: dnf-makecache.service: Consumed 32.536s CPU time.
Dec  3 08:49:56 np0005544118 systemd-logind[795]: New session 9 of user zuul.
Dec  3 08:49:56 np0005544118 systemd[1]: Started Session 9 of User zuul.
Dec  3 08:49:57 np0005544118 python3.9[31183]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:49:58 np0005544118 python3.9[31364]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:50:07 np0005544118 systemd[1]: session-9.scope: Deactivated successfully.
Dec  3 08:50:07 np0005544118 systemd[1]: session-9.scope: Consumed 8.242s CPU time.
Dec  3 08:50:07 np0005544118 systemd-logind[795]: Session 9 logged out. Waiting for processes to exit.
Dec  3 08:50:07 np0005544118 systemd-logind[795]: Removed session 9.
Dec  3 08:50:12 np0005544118 systemd-logind[795]: New session 10 of user zuul.
Dec  3 08:50:12 np0005544118 systemd[1]: Started Session 10 of User zuul.
Dec  3 08:50:13 np0005544118 python3.9[31574]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:50:14 np0005544118 systemd[1]: session-10.scope: Deactivated successfully.
Dec  3 08:50:14 np0005544118 systemd-logind[795]: Session 10 logged out. Waiting for processes to exit.
Dec  3 08:50:14 np0005544118 systemd-logind[795]: Removed session 10.
Dec  3 08:50:30 np0005544118 systemd-logind[795]: New session 11 of user zuul.
Dec  3 08:50:30 np0005544118 systemd[1]: Started Session 11 of User zuul.
Dec  3 08:50:30 np0005544118 python3.9[31756]: ansible-ansible.legacy.ping Invoked with data=pong
Dec  3 08:50:32 np0005544118 python3.9[31930]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:50:32 np0005544118 python3.9[32082]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:50:33 np0005544118 python3.9[32235]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:50:34 np0005544118 python3.9[32387]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:50:35 np0005544118 python3.9[32539]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:50:36 np0005544118 python3.9[32662]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764769834.8477893-126-14180839951006/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:50:36 np0005544118 python3.9[32814]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:50:37 np0005544118 python3.9[32970]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:50:38 np0005544118 python3.9[33122]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:50:39 np0005544118 python3.9[33272]: ansible-ansible.builtin.service_facts Invoked
Dec  3 08:50:40 np0005544118 irqbalance[786]: Cannot change IRQ 27 affinity: Operation not permitted
Dec  3 08:50:40 np0005544118 irqbalance[786]: IRQ 27 affinity is now unmanaged
Dec  3 08:50:43 np0005544118 python3.9[33526]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:50:43 np0005544118 python3.9[33676]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:50:44 np0005544118 python3.9[33830]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:50:45 np0005544118 python3.9[33988]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:50:46 np0005544118 python3.9[34072]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:51:43 np0005544118 systemd[1]: Reloading.
Dec  3 08:51:43 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:51:43 np0005544118 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec  3 08:51:44 np0005544118 systemd[1]: Reloading.
Dec  3 08:51:44 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:51:44 np0005544118 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec  3 08:51:44 np0005544118 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec  3 08:51:44 np0005544118 systemd[1]: Reloading.
Dec  3 08:51:44 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:51:45 np0005544118 systemd[1]: Listening on LVM2 poll daemon socket.
Dec  3 08:51:47 np0005544118 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Dec  3 08:51:47 np0005544118 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Dec  3 08:51:47 np0005544118 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Dec  3 08:53:10 np0005544118 kernel: SELinux:  Converting 2719 SID table entries...
Dec  3 08:53:10 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:53:10 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:53:10 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:53:10 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:53:10 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:53:10 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:53:10 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:53:10 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec  3 08:53:11 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 08:53:11 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 08:53:11 np0005544118 systemd[1]: Reloading.
Dec  3 08:53:11 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:53:11 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 08:53:12 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 08:53:12 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 08:53:12 np0005544118 systemd[1]: man-db-cache-update.service: Consumed 1.215s CPU time.
Dec  3 08:53:12 np0005544118 systemd[1]: run-re33bf0ea52b04ac39fc7d4b5c6b19a26.service: Deactivated successfully.
Dec  3 08:53:12 np0005544118 python3.9[35619]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:53:14 np0005544118 python3.9[35901]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec  3 08:53:15 np0005544118 python3.9[36053]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec  3 08:53:19 np0005544118 python3.9[36208]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:53:20 np0005544118 python3.9[36360]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec  3 08:53:21 np0005544118 python3.9[36512]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:53:22 np0005544118 python3.9[36664]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:53:22 np0005544118 python3.9[36787]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770001.5180063-452-236901333585887/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:53:31 np0005544118 python3.9[36939]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:53:32 np0005544118 python3.9[37091]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:53:33 np0005544118 python3.9[37244]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:53:35 np0005544118 python3.9[37396]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec  3 08:53:35 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 08:53:35 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 08:53:36 np0005544118 python3.9[37551]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  3 08:53:37 np0005544118 python3.9[37709]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  3 08:53:38 np0005544118 python3.9[37869]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec  3 08:53:39 np0005544118 python3.9[38022]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  3 08:53:40 np0005544118 python3.9[38180]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec  3 08:53:41 np0005544118 python3.9[38332]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:53:44 np0005544118 python3.9[38485]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:53:45 np0005544118 python3.9[38637]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:53:46 np0005544118 python3.9[38760]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770025.0852067-690-5929735005890/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:53:47 np0005544118 python3.9[38912]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:53:47 np0005544118 systemd[1]: Starting Load Kernel Modules...
Dec  3 08:53:47 np0005544118 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec  3 08:53:47 np0005544118 kernel: Bridge firewalling registered
Dec  3 08:53:47 np0005544118 systemd-modules-load[38916]: Inserted module 'br_netfilter'
Dec  3 08:53:47 np0005544118 systemd[1]: Finished Load Kernel Modules.
Dec  3 08:53:47 np0005544118 python3.9[39071]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:53:48 np0005544118 python3.9[39194]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770027.420521-736-3440279850598/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:53:49 np0005544118 python3.9[39346]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:53:52 np0005544118 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Dec  3 08:53:52 np0005544118 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Dec  3 08:53:53 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 08:53:53 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 08:53:53 np0005544118 systemd[1]: Reloading.
Dec  3 08:53:53 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:53:53 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 08:53:55 np0005544118 python3.9[41210]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:53:56 np0005544118 python3.9[42250]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec  3 08:53:56 np0005544118 python3.9[43065]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:53:57 np0005544118 python3.9[43363]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:53:57 np0005544118 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  3 08:53:58 np0005544118 systemd[1]: Starting Authorization Manager...
Dec  3 08:53:58 np0005544118 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  3 08:53:58 np0005544118 polkitd[43788]: Started polkitd version 0.117
Dec  3 08:53:58 np0005544118 systemd[1]: Started Authorization Manager.
Dec  3 08:53:58 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 08:53:58 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 08:53:58 np0005544118 systemd[1]: man-db-cache-update.service: Consumed 4.644s CPU time.
Dec  3 08:53:58 np0005544118 systemd[1]: run-r1a46637ad22643f59fb6cad6f00553ee.service: Deactivated successfully.
Dec  3 08:53:59 np0005544118 python3.9[43958]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:53:59 np0005544118 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec  3 08:53:59 np0005544118 systemd[1]: tuned.service: Deactivated successfully.
Dec  3 08:53:59 np0005544118 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec  3 08:53:59 np0005544118 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec  3 08:53:59 np0005544118 systemd[1]: Started Dynamic System Tuning Daemon.
Dec  3 08:53:59 np0005544118 python3.9[44120]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec  3 08:54:02 np0005544118 python3.9[44272]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:54:02 np0005544118 systemd[1]: Reloading.
Dec  3 08:54:02 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:54:03 np0005544118 python3.9[44461]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:54:03 np0005544118 systemd[1]: Reloading.
Dec  3 08:54:04 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:54:04 np0005544118 python3.9[44649]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:54:05 np0005544118 python3.9[44802]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:54:05 np0005544118 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec  3 08:54:06 np0005544118 python3.9[44955]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:54:08 np0005544118 python3.9[45117]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:54:09 np0005544118 python3.9[45270]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:54:09 np0005544118 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec  3 08:54:09 np0005544118 systemd[1]: Stopped Apply Kernel Variables.
Dec  3 08:54:09 np0005544118 systemd[1]: Stopping Apply Kernel Variables...
Dec  3 08:54:09 np0005544118 systemd[1]: Starting Apply Kernel Variables...
Dec  3 08:54:09 np0005544118 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec  3 08:54:09 np0005544118 systemd[1]: Finished Apply Kernel Variables.
Dec  3 08:54:09 np0005544118 systemd[1]: session-11.scope: Deactivated successfully.
Dec  3 08:54:09 np0005544118 systemd[1]: session-11.scope: Consumed 2min 25.668s CPU time.
Dec  3 08:54:09 np0005544118 systemd-logind[795]: Session 11 logged out. Waiting for processes to exit.
Dec  3 08:54:09 np0005544118 systemd-logind[795]: Removed session 11.
Dec  3 08:54:14 np0005544118 chronyd[792]: Selected source 138.197.164.54 (2.centos.pool.ntp.org)
Dec  3 08:54:15 np0005544118 systemd-logind[795]: New session 12 of user zuul.
Dec  3 08:54:15 np0005544118 systemd[1]: Started Session 12 of User zuul.
Dec  3 08:54:16 np0005544118 python3.9[45453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:54:17 np0005544118 python3.9[45607]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:54:19 np0005544118 python3.9[45763]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:54:20 np0005544118 python3.9[45914]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:54:20 np0005544118 python3.9[46070]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:54:21 np0005544118 python3.9[46154]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:54:24 np0005544118 python3.9[46307]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:54:25 np0005544118 python3.9[46478]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:54:26 np0005544118 python3.9[46630]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:54:26 np0005544118 systemd[1]: var-lib-containers-storage-overlay-compat2346762044-merged.mount: Deactivated successfully.
Dec  3 08:54:27 np0005544118 podman[46631]: 2025-12-03 13:54:27.03886648 +0000 UTC m=+0.734741202 system refresh
Dec  3 08:54:27 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:54:27 np0005544118 python3.9[46792]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:54:28 np0005544118 python3.9[46915]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770067.2507992-199-254706701345481/.source.json follow=False _original_basename=podman_network_config.j2 checksum=e254b6fff08df24d1f2b098250fd3dd1d7b6fcba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:54:29 np0005544118 python3.9[47067]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:54:29 np0005544118 python3.9[47190]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770068.6481938-229-98718502164247/.source.conf follow=False _original_basename=registries.conf.j2 checksum=838e71b71d98123e85de319180ac8f7ae01d4a9a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:54:30 np0005544118 python3.9[47342]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:54:31 np0005544118 python3.9[47494]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:54:31 np0005544118 python3.9[47646]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:54:32 np0005544118 python3.9[47798]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:54:33 np0005544118 python3.9[47948]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:54:34 np0005544118 python3.9[48102]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:54:36 np0005544118 python3.9[48255]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:54:40 np0005544118 python3.9[48415]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:54:42 np0005544118 python3.9[48568]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:54:45 np0005544118 python3.9[48721]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:54:47 np0005544118 python3.9[48877]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:54:52 np0005544118 python3.9[49045]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:54:54 np0005544118 python3.9[49198]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:55:11 np0005544118 python3.9[49534]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:55:14 np0005544118 python3.9[49690]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:55:15 np0005544118 python3.9[49865]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:55:16 np0005544118 python3.9[49988]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764770114.8890786-525-118563482364833/.source.json _original_basename=.z4piif44 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:55:17 np0005544118 python3.9[50140]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  3 08:55:17 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:19 np0005544118 systemd[1]: var-lib-containers-storage-overlay-compat1767771419-lower\x2dmapped.mount: Deactivated successfully.
Dec  3 08:55:27 np0005544118 podman[50152]: 2025-12-03 13:55:27.983888601 +0000 UTC m=+10.816881598 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  3 08:55:27 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:28 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:28 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:29 np0005544118 python3.9[50450]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  3 08:55:29 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:45 np0005544118 podman[50462]: 2025-12-03 13:55:45.569470733 +0000 UTC m=+16.352813668 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 08:55:45 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:45 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:45 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:46 np0005544118 python3.9[50771]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  3 08:55:46 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:48 np0005544118 podman[50783]: 2025-12-03 13:55:48.424677994 +0000 UTC m=+1.818746093 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  3 08:55:48 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:48 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:48 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:55:49 np0005544118 python3.9[51020]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  3 08:55:49 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:13 np0005544118 podman[51033]: 2025-12-03 13:56:13.227864881 +0000 UTC m=+23.753406682 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  3 08:56:13 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:13 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:13 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:14 np0005544118 python3.9[51285]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  3 08:56:14 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:22 np0005544118 podman[51297]: 2025-12-03 13:56:22.337478671 +0000 UTC m=+8.029286638 image pull 343ba269c9fe0a56d7572c8ca328dbce002017c4dd4986f43667971dd03085c2 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec  3 08:56:22 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:22 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:22 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:23 np0005544118 python3.9[51552]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec  3 08:56:24 np0005544118 podman[51565]: 2025-12-03 13:56:24.478330357 +0000 UTC m=+1.246477953 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec  3 08:56:24 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:24 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:24 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:56:25 np0005544118 systemd-logind[795]: Session 12 logged out. Waiting for processes to exit.
Dec  3 08:56:25 np0005544118 systemd[1]: session-12.scope: Deactivated successfully.
Dec  3 08:56:25 np0005544118 systemd[1]: session-12.scope: Consumed 2min 27.334s CPU time.
Dec  3 08:56:25 np0005544118 systemd-logind[795]: Removed session 12.
Dec  3 08:56:30 np0005544118 systemd-logind[795]: New session 13 of user zuul.
Dec  3 08:56:30 np0005544118 systemd[1]: Started Session 13 of User zuul.
Dec  3 08:56:31 np0005544118 python3.9[51860]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:56:32 np0005544118 python3.9[52016]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec  3 08:56:33 np0005544118 python3.9[52169]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  3 08:56:34 np0005544118 python3.9[52327]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  3 08:56:35 np0005544118 python3.9[52487]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:56:36 np0005544118 python3.9[52571]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 08:56:40 np0005544118 python3.9[52733]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:56:57 np0005544118 kernel: SELinux:  Converting 2732 SID table entries...
Dec  3 08:56:57 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:56:57 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:56:57 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:56:57 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:56:57 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:56:57 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:56:57 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:56:57 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec  3 08:56:57 np0005544118 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec  3 08:56:58 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 08:56:58 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 08:56:58 np0005544118 systemd[1]: Reloading.
Dec  3 08:56:58 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:56:58 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:56:59 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 08:56:59 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 08:56:59 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 08:56:59 np0005544118 systemd[1]: run-re75cd3269cb84ec68b59a2a3f73648a3.service: Deactivated successfully.
Dec  3 08:57:00 np0005544118 python3.9[53832]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 08:57:00 np0005544118 systemd[1]: Reloading.
Dec  3 08:57:01 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:57:01 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:57:01 np0005544118 systemd[1]: Starting Open vSwitch Database Unit...
Dec  3 08:57:01 np0005544118 chown[53874]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec  3 08:57:01 np0005544118 ovs-ctl[53879]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec  3 08:57:01 np0005544118 ovs-ctl[53879]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec  3 08:57:01 np0005544118 ovs-ctl[53879]: Starting ovsdb-server [  OK  ]
Dec  3 08:57:01 np0005544118 ovs-vsctl[53928]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec  3 08:57:01 np0005544118 ovs-vsctl[53948]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"ac9297d1-94e5-43bb-91f9-3d345a639adf\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec  3 08:57:01 np0005544118 ovs-ctl[53879]: Configuring Open vSwitch system IDs [  OK  ]
Dec  3 08:57:01 np0005544118 ovs-ctl[53879]: Enabling remote OVSDB managers [  OK  ]
Dec  3 08:57:01 np0005544118 ovs-vsctl[53954]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  3 08:57:01 np0005544118 systemd[1]: Started Open vSwitch Database Unit.
Dec  3 08:57:01 np0005544118 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec  3 08:57:01 np0005544118 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec  3 08:57:01 np0005544118 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec  3 08:57:01 np0005544118 kernel: openvswitch: Open vSwitch switching datapath
Dec  3 08:57:01 np0005544118 ovs-ctl[53998]: Inserting openvswitch module [  OK  ]
Dec  3 08:57:01 np0005544118 ovs-ctl[53967]: Starting ovs-vswitchd [  OK  ]
Dec  3 08:57:01 np0005544118 ovs-vsctl[54019]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Dec  3 08:57:01 np0005544118 ovs-ctl[53967]: Enabling remote OVSDB managers [  OK  ]
Dec  3 08:57:01 np0005544118 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec  3 08:57:01 np0005544118 systemd[1]: Starting Open vSwitch...
Dec  3 08:57:01 np0005544118 systemd[1]: Finished Open vSwitch.
Dec  3 08:57:02 np0005544118 python3.9[54171]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:57:03 np0005544118 python3.9[54323]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec  3 08:57:05 np0005544118 kernel: SELinux:  Converting 2746 SID table entries...
Dec  3 08:57:05 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 08:57:05 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 08:57:05 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 08:57:05 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 08:57:05 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 08:57:05 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 08:57:05 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 08:57:06 np0005544118 python3.9[54478]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:57:07 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec  3 08:57:07 np0005544118 python3.9[54636]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:57:09 np0005544118 python3.9[54789]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:57:11 np0005544118 python3.9[55076]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  3 08:57:11 np0005544118 python3.9[55226]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:57:12 np0005544118 python3.9[55380]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:57:14 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 08:57:14 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 08:57:14 np0005544118 systemd[1]: Reloading.
Dec  3 08:57:14 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:57:14 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:57:15 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 08:57:15 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 08:57:15 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 08:57:15 np0005544118 systemd[1]: run-r6d7f27fe688441439bc0d25ebce52f79.service: Deactivated successfully.
Dec  3 08:57:16 np0005544118 python3.9[55697]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:57:16 np0005544118 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec  3 08:57:16 np0005544118 systemd[1]: Stopped Network Manager Wait Online.
Dec  3 08:57:16 np0005544118 systemd[1]: Stopping Network Manager Wait Online...
Dec  3 08:57:16 np0005544118 systemd[1]: Stopping Network Manager...
Dec  3 08:57:16 np0005544118 NetworkManager[7198]: <info>  [1764770236.3077] caught SIGTERM, shutting down normally.
Dec  3 08:57:16 np0005544118 NetworkManager[7198]: <info>  [1764770236.3092] dhcp4 (eth0): canceled DHCP transaction
Dec  3 08:57:16 np0005544118 NetworkManager[7198]: <info>  [1764770236.3092] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:57:16 np0005544118 NetworkManager[7198]: <info>  [1764770236.3092] dhcp4 (eth0): state changed no lease
Dec  3 08:57:16 np0005544118 NetworkManager[7198]: <info>  [1764770236.3096] manager: NetworkManager state is now CONNECTED_SITE
Dec  3 08:57:16 np0005544118 NetworkManager[7198]: <info>  [1764770236.3176] exiting (success)
Dec  3 08:57:16 np0005544118 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  3 08:57:16 np0005544118 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  3 08:57:16 np0005544118 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec  3 08:57:16 np0005544118 systemd[1]: Stopped Network Manager.
Dec  3 08:57:16 np0005544118 systemd[1]: NetworkManager.service: Consumed 12.512s CPU time, 4.1M memory peak, read 0B from disk, written 35.5K to disk.
Dec  3 08:57:16 np0005544118 systemd[1]: Starting Network Manager...
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.3955] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:0378e26f-1df1-4cca-950c-062911188078)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.3957] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4021] manager[0x5591b705e090]: monitoring kernel firmware directory '/lib/firmware'.
Dec  3 08:57:16 np0005544118 systemd[1]: Starting Hostname Service...
Dec  3 08:57:16 np0005544118 systemd[1]: Started Hostname Service.
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4867] hostname: hostname: using hostnamed
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4868] hostname: static hostname changed from (none) to "compute-1"
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4873] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4877] manager[0x5591b705e090]: rfkill: Wi-Fi hardware radio set enabled
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4877] manager[0x5591b705e090]: rfkill: WWAN hardware radio set enabled
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4896] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4904] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4905] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4906] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4906] manager: Networking is enabled by state file
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4908] settings: Loaded settings plugin: keyfile (internal)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4910] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4933] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4941] dhcp: init: Using DHCP client 'internal'
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4943] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4947] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4951] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4958] device (lo): Activation: starting connection 'lo' (392164e0-7fb3-4d91-9407-dec18de6b483)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4971] device (eth0): carrier: link connected
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4976] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4982] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4982] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4989] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.4995] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5000] device (eth1): carrier: link connected
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5004] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5009] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (7005831a-9eeb-556d-afba-a4917a46e274) (indicated)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5009] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5014] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5022] device (eth1): Activation: starting connection 'ci-private-network' (7005831a-9eeb-556d-afba-a4917a46e274)
Dec  3 08:57:16 np0005544118 systemd[1]: Started Network Manager.
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5027] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5036] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5040] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5042] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5045] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5049] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5052] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5055] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5059] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5063] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5067] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5074] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5088] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5095] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5097] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5103] device (lo): Activation: successful, device activated.
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5110] dhcp4 (eth0): state changed new lease, address=38.102.83.106
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5116] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec  3 08:57:16 np0005544118 systemd[1]: Starting Network Manager Wait Online...
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5198] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5208] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5210] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5214] manager: NetworkManager state is now CONNECTED_LOCAL
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5219] device (eth1): Activation: successful, device activated.
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5236] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5239] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5244] manager: NetworkManager state is now CONNECTED_SITE
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5248] device (eth0): Activation: successful, device activated.
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5254] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec  3 08:57:16 np0005544118 NetworkManager[55710]: <info>  [1764770236.5258] manager: startup complete
Dec  3 08:57:16 np0005544118 systemd[1]: Finished Network Manager Wait Online.
Dec  3 08:57:17 np0005544118 python3.9[55924]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:57:22 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 08:57:22 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 08:57:22 np0005544118 systemd[1]: Reloading.
Dec  3 08:57:22 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:57:22 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:57:23 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 08:57:24 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 08:57:24 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 08:57:24 np0005544118 systemd[1]: run-r67afcf29f13147eb8c4bc13863e5b0da.service: Deactivated successfully.
Dec  3 08:57:25 np0005544118 python3.9[56384]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:57:25 np0005544118 python3.9[56536]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:26 np0005544118 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  3 08:57:26 np0005544118 python3.9[56690]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:27 np0005544118 python3.9[56842]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:28 np0005544118 python3.9[56994]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:28 np0005544118 python3.9[57146]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:29 np0005544118 python3.9[57298]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:57:29 np0005544118 python3.9[57421]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770248.8071904-439-26930070153603/.source _original_basename=.1tk8lzss follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:30 np0005544118 python3.9[57573]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:31 np0005544118 python3.9[57725]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec  3 08:57:32 np0005544118 python3.9[57877]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:34 np0005544118 python3.9[58304]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec  3 08:57:35 np0005544118 ansible-async_wrapper.py[58479]: Invoked with j74770913244 300 /home/zuul/.ansible/tmp/ansible-tmp-1764770254.3720732-571-263399770811685/AnsiballZ_edpm_os_net_config.py _
Dec  3 08:57:35 np0005544118 ansible-async_wrapper.py[58482]: Starting module and watcher
Dec  3 08:57:35 np0005544118 ansible-async_wrapper.py[58482]: Start watching 58483 (300)
Dec  3 08:57:35 np0005544118 ansible-async_wrapper.py[58483]: Start module (58483)
Dec  3 08:57:35 np0005544118 ansible-async_wrapper.py[58479]: Return async_wrapper task started.
Dec  3 08:57:35 np0005544118 python3.9[58484]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec  3 08:57:36 np0005544118 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec  3 08:57:36 np0005544118 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec  3 08:57:36 np0005544118 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec  3 08:57:36 np0005544118 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec  3 08:57:36 np0005544118 kernel: cfg80211: failed to load regulatory.db
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.2994] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3017] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3682] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3683] audit: op="connection-add" uuid="8c56eea5-75fb-45d2-ba60-6ce97e5293a8" name="br-ex-br" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3702] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3704] audit: op="connection-add" uuid="7742645f-d293-4cc5-b5b1-80cb2477b7c9" name="br-ex-port" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3717] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3719] audit: op="connection-add" uuid="77ee6495-34b6-4d9b-a986-b6045b5c8dd9" name="eth1-port" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3731] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3732] audit: op="connection-add" uuid="21342c3c-14bc-4be6-828c-ad21b64c146e" name="vlan20-port" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3743] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3744] audit: op="connection-add" uuid="51475e2d-2763-45aa-a19b-e9eac42441d8" name="vlan21-port" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3755] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3756] audit: op="connection-add" uuid="862e116c-27ed-4f79-8834-d874de3a3c6f" name="vlan22-port" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3776] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.timestamp,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode,ipv6.dhcp-timeout" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3792] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3793] audit: op="connection-add" uuid="f78b7fa7-601f-4ed5-a338-0e693ad99511" name="br-ex-if" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3865] audit: op="connection-update" uuid="7005831a-9eeb-556d-afba-a4917a46e274" name="ci-private-network" args="ipv4.routes,ipv4.routing-rules,ipv4.dns,ipv4.method,ipv4.addresses,ipv4.never-default,ovs-external-ids.data,ovs-interface.type,ipv6.routes,ipv6.routing-rules,ipv6.dns,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,connection.timestamp,connection.controller,connection.port-type,connection.master,connection.slave-type" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3886] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3888] audit: op="connection-add" uuid="99798178-1c97-4a64-b468-2a824308899f" name="vlan20-if" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3907] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3909] audit: op="connection-add" uuid="15d68699-e36f-41b1-a9d8-c1a061b69fd9" name="vlan21-if" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3926] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3928] audit: op="connection-add" uuid="2d031354-53ec-4709-8ab1-273711a6c7f3" name="vlan22-if" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3943] audit: op="connection-delete" uuid="a63b2672-8230-3037-8b10-0c5a89d5ff35" name="Wired connection 1" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3957] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3968] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3972] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (8c56eea5-75fb-45d2-ba60-6ce97e5293a8)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3973] audit: op="connection-activate" uuid="8c56eea5-75fb-45d2-ba60-6ce97e5293a8" name="br-ex-br" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3975] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3980] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3984] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (7742645f-d293-4cc5-b5b1-80cb2477b7c9)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3986] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3991] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3995] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (77ee6495-34b6-4d9b-a986-b6045b5c8dd9)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.3997] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4003] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4006] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (21342c3c-14bc-4be6-828c-ad21b64c146e)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4009] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4014] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4018] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (51475e2d-2763-45aa-a19b-e9eac42441d8)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4020] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4025] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4029] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (862e116c-27ed-4f79-8834-d874de3a3c6f)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4029] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4031] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4033] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4039] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4043] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4047] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (f78b7fa7-601f-4ed5-a338-0e693ad99511)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4048] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4051] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4052] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4053] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4055] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4067] device (eth1): disconnecting for new activation request.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4069] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4071] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4073] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4074] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4076] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4080] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4083] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (99798178-1c97-4a64-b468-2a824308899f)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4084] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4087] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4089] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4090] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4092] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4096] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4101] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (15d68699-e36f-41b1-a9d8-c1a061b69fd9)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4102] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4105] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4108] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4109] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4112] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4117] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4121] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (2d031354-53ec-4709-8ab1-273711a6c7f3)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4122] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4124] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4126] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4127] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4129] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4143] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4146] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4150] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4152] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4160] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4163] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4169] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4172] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4175] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4180] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4186] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4191] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4193] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 kernel: ovs-system: entered promiscuous mode
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4199] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4204] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4208] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4211] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 kernel: Timeout policy base is empty
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4217] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 systemd-udevd[58489]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4222] dhcp4 (eth0): canceled DHCP transaction
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4222] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4223] dhcp4 (eth0): state changed no lease
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4225] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec  3 08:57:37 np0005544118 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4241] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4245] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58485 uid=0 result="fail" reason="Device is not activated"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4314] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4327] dhcp4 (eth0): state changed new lease, address=38.102.83.106
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4339] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4352] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4404] device (eth1): disconnecting for new activation request.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4405] audit: op="connection-activate" uuid="7005831a-9eeb-556d-afba-a4917a46e274" name="ci-private-network" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4419] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4557] device (eth1): Activation: starting connection 'ci-private-network' (7005831a-9eeb-556d-afba-a4917a46e274)
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4566] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4567] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4569] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4571] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4572] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4574] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4592] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4597] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4603] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4609] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4614] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4620] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4624] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4629] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4633] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4638] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4642] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4647] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4652] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4657] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4661] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58485 uid=0 result="success"
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4664] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 kernel: br-ex: entered promiscuous mode
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4689] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4696] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4746] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4748] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4754] device (eth1): Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4814] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4827] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 kernel: vlan22: entered promiscuous mode
Dec  3 08:57:37 np0005544118 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4870] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4875] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4884] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 kernel: vlan21: entered promiscuous mode
Dec  3 08:57:37 np0005544118 systemd-udevd[58491]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.4988] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5001] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5032] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5034] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5042] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 kernel: vlan20: entered promiscuous mode
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5100] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5126] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5149] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5150] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5159] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5221] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5237] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5270] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5272] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec  3 08:57:37 np0005544118 NetworkManager[55710]: <info>  [1764770257.5280] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec  3 08:57:38 np0005544118 NetworkManager[55710]: <info>  [1764770258.6653] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58485 uid=0 result="success"
Dec  3 08:57:38 np0005544118 NetworkManager[55710]: <info>  [1764770258.8587] checkpoint[0x5591b7034950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec  3 08:57:38 np0005544118 NetworkManager[55710]: <info>  [1764770258.8590] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58485 uid=0 result="success"
Dec  3 08:57:39 np0005544118 python3.9[58817]: ansible-ansible.legacy.async_status Invoked with jid=j74770913244.58479 mode=status _async_dir=/root/.ansible_async
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.1508] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58485 uid=0 result="success"
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.1527] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58485 uid=0 result="success"
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.3704] audit: op="networking-control" arg="global-dns-configuration" pid=58485 uid=0 result="success"
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.3752] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.3800] audit: op="networking-control" arg="global-dns-configuration" pid=58485 uid=0 result="success"
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.3834] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58485 uid=0 result="success"
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.5468] checkpoint[0x5591b7034a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec  3 08:57:39 np0005544118 NetworkManager[55710]: <info>  [1764770259.5472] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58485 uid=0 result="success"
Dec  3 08:57:39 np0005544118 ansible-async_wrapper.py[58483]: Module complete (58483)
Dec  3 08:57:40 np0005544118 ansible-async_wrapper.py[58482]: Done in kid B.
Dec  3 08:57:42 np0005544118 python3.9[58923]: ansible-ansible.legacy.async_status Invoked with jid=j74770913244.58479 mode=status _async_dir=/root/.ansible_async
Dec  3 08:57:42 np0005544118 python3.9[59023]: ansible-ansible.legacy.async_status Invoked with jid=j74770913244.58479 mode=cleanup _async_dir=/root/.ansible_async
Dec  3 08:57:43 np0005544118 python3.9[59175]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:57:44 np0005544118 python3.9[59298]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770263.2295494-620-129025962549596/.source.returncode _original_basename=.ijlo5y70 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:45 np0005544118 python3.9[59450]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:57:45 np0005544118 python3.9[59573]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770264.5807474-652-234316117617372/.source.cfg _original_basename=.zwv54ext follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:57:46 np0005544118 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  3 08:57:46 np0005544118 python3.9[59726]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:57:46 np0005544118 systemd[1]: Reloading Network Manager...
Dec  3 08:57:46 np0005544118 NetworkManager[55710]: <info>  [1764770266.6127] audit: op="reload" arg="0" pid=59732 uid=0 result="success"
Dec  3 08:57:46 np0005544118 NetworkManager[55710]: <info>  [1764770266.6137] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec  3 08:57:46 np0005544118 systemd[1]: Reloaded Network Manager.
Dec  3 08:57:47 np0005544118 systemd[1]: session-13.scope: Deactivated successfully.
Dec  3 08:57:47 np0005544118 systemd[1]: session-13.scope: Consumed 55.023s CPU time.
Dec  3 08:57:47 np0005544118 systemd-logind[795]: Session 13 logged out. Waiting for processes to exit.
Dec  3 08:57:47 np0005544118 systemd-logind[795]: Removed session 13.
Dec  3 08:57:53 np0005544118 systemd-logind[795]: New session 14 of user zuul.
Dec  3 08:57:53 np0005544118 systemd[1]: Started Session 14 of User zuul.
Dec  3 08:57:54 np0005544118 python3.9[59917]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:57:55 np0005544118 python3.9[60071]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:57:56 np0005544118 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec  3 08:57:56 np0005544118 python3.9[60262]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:57:57 np0005544118 systemd[1]: session-14.scope: Deactivated successfully.
Dec  3 08:57:57 np0005544118 systemd-logind[795]: Session 14 logged out. Waiting for processes to exit.
Dec  3 08:57:57 np0005544118 systemd[1]: session-14.scope: Consumed 2.367s CPU time.
Dec  3 08:57:57 np0005544118 systemd-logind[795]: Removed session 14.
Dec  3 08:58:03 np0005544118 systemd-logind[795]: New session 15 of user zuul.
Dec  3 08:58:03 np0005544118 systemd[1]: Started Session 15 of User zuul.
Dec  3 08:58:04 np0005544118 python3.9[60443]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:58:05 np0005544118 python3.9[60597]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:58:06 np0005544118 python3.9[60754]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:58:07 np0005544118 python3.9[60838]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:58:09 np0005544118 python3.9[60992]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:58:10 np0005544118 python3.9[61184]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:11 np0005544118 python3.9[61336]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:58:11 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 08:58:12 np0005544118 python3.9[61498]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:13 np0005544118 python3.9[61576]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:13 np0005544118 python3.9[61728]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:14 np0005544118 python3.9[61806]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:15 np0005544118 python3.9[61958]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:15 np0005544118 python3.9[62110]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:16 np0005544118 python3.9[62262]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:17 np0005544118 python3.9[62414]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:17 np0005544118 python3.9[62566]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:58:20 np0005544118 python3.9[62719]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:58:21 np0005544118 python3.9[62873]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:58:22 np0005544118 python3.9[63025]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:58:22 np0005544118 python3.9[63177]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:58:23 np0005544118 python3.9[63330]: ansible-service_facts Invoked
Dec  3 08:58:23 np0005544118 network[63347]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  3 08:58:23 np0005544118 network[63348]: 'network-scripts' will be removed from distribution in near future.
Dec  3 08:58:23 np0005544118 network[63349]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  3 08:58:29 np0005544118 python3.9[63801]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 08:58:32 np0005544118 python3.9[63954]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec  3 08:58:33 np0005544118 python3.9[64106]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:34 np0005544118 python3.9[64231]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770312.9589958-446-226675899794445/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:34 np0005544118 python3.9[64385]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:35 np0005544118 python3.9[64510]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770314.3754985-476-137140310035151/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:36 np0005544118 python3.9[64664]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:38 np0005544118 python3.9[64818]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:58:39 np0005544118 python3.9[64902]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:58:40 np0005544118 python3.9[65056]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 08:58:41 np0005544118 python3.9[65140]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:58:41 np0005544118 chronyd[792]: chronyd exiting
Dec  3 08:58:41 np0005544118 systemd[1]: Stopping NTP client/server...
Dec  3 08:58:41 np0005544118 systemd[1]: chronyd.service: Deactivated successfully.
Dec  3 08:58:41 np0005544118 systemd[1]: Stopped NTP client/server.
Dec  3 08:58:41 np0005544118 systemd[1]: Starting NTP client/server...
Dec  3 08:58:41 np0005544118 chronyd[65150]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec  3 08:58:41 np0005544118 chronyd[65150]: Frequency -26.261 +/- 0.199 ppm read from /var/lib/chrony/drift
Dec  3 08:58:41 np0005544118 chronyd[65150]: Loaded seccomp filter (level 2)
Dec  3 08:58:41 np0005544118 systemd[1]: Started NTP client/server.
Dec  3 08:58:41 np0005544118 systemd[1]: session-15.scope: Deactivated successfully.
Dec  3 08:58:41 np0005544118 systemd[1]: session-15.scope: Consumed 26.636s CPU time.
Dec  3 08:58:41 np0005544118 systemd-logind[795]: Session 15 logged out. Waiting for processes to exit.
Dec  3 08:58:41 np0005544118 systemd-logind[795]: Removed session 15.
Dec  3 08:58:48 np0005544118 systemd-logind[795]: New session 16 of user zuul.
Dec  3 08:58:48 np0005544118 systemd[1]: Started Session 16 of User zuul.
Dec  3 08:58:49 np0005544118 python3.9[65329]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:58:50 np0005544118 python3.9[65485]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:51 np0005544118 python3.9[65660]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:51 np0005544118 python3.9[65738]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.i4panwmk recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:52 np0005544118 python3.9[65890]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:53 np0005544118 python3.9[66013]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770332.4301517-103-6657221992654/.source _original_basename=.d90ohhmz follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:54 np0005544118 python3.9[66165]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:54 np0005544118 python3.9[66317]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:55 np0005544118 python3.9[66440]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770334.363916-151-58665935712598/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:55 np0005544118 python3.9[66594]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:56 np0005544118 python3.9[66717]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770335.5249958-151-198117125648086/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 08:58:57 np0005544118 python3.9[66869]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:57 np0005544118 python3.9[67021]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:58 np0005544118 python3.9[67144]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770337.5105386-225-65628389447975/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:58:59 np0005544118 python3.9[67296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:58:59 np0005544118 python3.9[67419]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770338.717056-255-155433221606985/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:00 np0005544118 python3.9[67571]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:59:00 np0005544118 systemd[1]: Reloading.
Dec  3 08:59:00 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:59:00 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:59:01 np0005544118 systemd[1]: Reloading.
Dec  3 08:59:01 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:59:01 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:59:01 np0005544118 systemd[1]: Starting EDPM Container Shutdown...
Dec  3 08:59:01 np0005544118 systemd[1]: Finished EDPM Container Shutdown.
Dec  3 08:59:02 np0005544118 python3.9[67800]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:02 np0005544118 python3.9[67923]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770341.5617998-301-220688197564674/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:03 np0005544118 python3.9[68075]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:03 np0005544118 python3.9[68198]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770342.7657952-331-235785446799785/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:04 np0005544118 python3.9[68350]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:59:04 np0005544118 systemd[1]: Reloading.
Dec  3 08:59:04 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:59:04 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:59:04 np0005544118 systemd[1]: Reloading.
Dec  3 08:59:04 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:59:04 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:59:05 np0005544118 systemd[1]: Starting Create netns directory...
Dec  3 08:59:05 np0005544118 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  3 08:59:05 np0005544118 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  3 08:59:05 np0005544118 systemd[1]: Finished Create netns directory.
Dec  3 08:59:05 np0005544118 python3.9[68578]: ansible-ansible.builtin.service_facts Invoked
Dec  3 08:59:05 np0005544118 network[68595]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  3 08:59:05 np0005544118 network[68596]: 'network-scripts' will be removed from distribution in near future.
Dec  3 08:59:05 np0005544118 network[68597]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  3 08:59:09 np0005544118 python3.9[68859]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:59:09 np0005544118 systemd[1]: Reloading.
Dec  3 08:59:09 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:59:09 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:59:09 np0005544118 systemd[1]: Stopping IPv4 firewall with iptables...
Dec  3 08:59:10 np0005544118 iptables.init[68899]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec  3 08:59:10 np0005544118 iptables.init[68899]: iptables: Flushing firewall rules: [  OK  ]
Dec  3 08:59:10 np0005544118 systemd[1]: iptables.service: Deactivated successfully.
Dec  3 08:59:10 np0005544118 systemd[1]: Stopped IPv4 firewall with iptables.
Dec  3 08:59:11 np0005544118 python3.9[69096]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:59:11 np0005544118 python3.9[69250]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 08:59:11 np0005544118 systemd[1]: Reloading.
Dec  3 08:59:11 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 08:59:11 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 08:59:12 np0005544118 systemd[1]: Starting Netfilter Tables...
Dec  3 08:59:12 np0005544118 systemd[1]: Finished Netfilter Tables.
Dec  3 08:59:13 np0005544118 python3.9[69443]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:59:14 np0005544118 python3.9[69596]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:14 np0005544118 python3.9[69721]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770353.5339334-469-110381615099831/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:15 np0005544118 python3.9[69874]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:59:15 np0005544118 systemd[1]: Reloading OpenSSH server daemon...
Dec  3 08:59:15 np0005544118 systemd[1]: Reloaded OpenSSH server daemon.
Dec  3 08:59:16 np0005544118 python3.9[70030]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:16 np0005544118 python3.9[70182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:17 np0005544118 python3.9[70305]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770356.3622787-531-277836803262358/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:18 np0005544118 python3.9[70457]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec  3 08:59:18 np0005544118 systemd[1]: Starting Time & Date Service...
Dec  3 08:59:18 np0005544118 systemd[1]: Started Time & Date Service.
Dec  3 08:59:19 np0005544118 python3.9[70613]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:19 np0005544118 python3.9[70765]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:20 np0005544118 python3.9[70888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770359.3745277-601-222988656371777/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:20 np0005544118 python3.9[71040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:21 np0005544118 python3.9[71163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770360.4928331-631-242854878944371/.source.yaml _original_basename=.pmxwcea0 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:22 np0005544118 python3.9[71315]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:22 np0005544118 python3.9[71438]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770361.6430776-661-36989591960626/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:23 np0005544118 python3.9[71590]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:59:23 np0005544118 python3.9[71743]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:59:24 np0005544118 python3[71896]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  3 08:59:25 np0005544118 python3.9[72048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:26 np0005544118 python3.9[72171]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770364.996354-740-279591989095820/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:26 np0005544118 python3.9[72323]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:27 np0005544118 python3.9[72446]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770366.212945-769-192160282648630/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:27 np0005544118 python3.9[72598]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:28 np0005544118 python3.9[72721]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770367.4241195-799-92417282295681/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:29 np0005544118 python3.9[72873]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:29 np0005544118 python3.9[72996]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770368.6238837-829-75024406392283/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:30 np0005544118 python3.9[73148]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 08:59:30 np0005544118 python3.9[73271]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770369.8300486-859-43066543481370/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:31 np0005544118 python3.9[73423]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:32 np0005544118 python3.9[73575]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:59:33 np0005544118 python3.9[73734]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:33 np0005544118 python3.9[73887]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:34 np0005544118 python3.9[74039]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:35 np0005544118 python3.9[74191]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  3 08:59:35 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 08:59:36 np0005544118 python3.9[74345]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec  3 08:59:36 np0005544118 systemd[1]: session-16.scope: Deactivated successfully.
Dec  3 08:59:36 np0005544118 systemd[1]: session-16.scope: Consumed 36.147s CPU time.
Dec  3 08:59:36 np0005544118 systemd-logind[795]: Session 16 logged out. Waiting for processes to exit.
Dec  3 08:59:36 np0005544118 systemd-logind[795]: Removed session 16.
Dec  3 08:59:41 np0005544118 systemd-logind[795]: New session 17 of user zuul.
Dec  3 08:59:41 np0005544118 systemd[1]: Started Session 17 of User zuul.
Dec  3 08:59:42 np0005544118 python3.9[74526]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec  3 08:59:43 np0005544118 python3.9[74678]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:59:44 np0005544118 python3.9[74830]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:59:45 np0005544118 python3.9[74982]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC7uKfw/1xwGVs8/AAPX9ipXj0sP0hJPCyD+QnIsKSvHQa+Us9s1nVoPc99O4X3spsXvkXQiZX+datRVOt2E+/pUuLhx+nT6bHZggohpDQ4ol0Ty8kPLlrVg+QEWgiV/Ohla1D7T5byQFbgZmuJT4takLgffRCJDRtXGLDkR2sZMOTH5TWrfNPYGwDIoIlKXb+FXF7X2JrhJJO73bAcNA39ynTmbjUI6/fIElx0hQru9sjRrIBV2x2rqyEsgjdTmcNHlLr5ylOiOZqLgG4G/BovreQhzqa1yIIC57nKPZJsgv35jNHnTaLENT2KnGHVawsDuI2O5puy7+w2yrhyAh/9BqXX1/6BYM1jcpH/lcIW5aGmDpY8AKDfCvnH1YQE5/yCumuYClWaHWVeHeJnfSECHq4Bh8j6EJ41jO6j/eSE5FQNCXATHyrHN15YWRZnybOd95/z5MN8I3P6L5tnmiErSNCFYP71H9pqyARftZWWL6blBVuilU6FvQK+3YNWF6c=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMdDPSZpj4YCnxwHR/c46xWSCufcsWzZbQcGA8CWlWhI#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMIuegYgKJAhHP4mRw80OZQZRYZnzipgINakK0nOIQWBn+GoyV1cw3k449/fV2o+seztlAz7Lb88zdOQoBPggmI=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtqpeJUEzO3zll4wr0uxjqCIuXYYX0tHkT/j+ekQ67G336RiTIiSW/0ptt7mlxVIU9fEYtdhli3GgTPOCqLWBRu51YDNUVcl1q01Z0OjhWUuMVXHKWJplYI3CQAoQlM5D1X3gjY92XBfJXgOvAZGrNDUQEgJETwdS3HPQkv2xyK6LyjGZlplL/pAuExJBBhVP+Fpj+j6/lc6sY0b1b6ymsoV0zIWEW1J3sq2pC7eydkfB9R3TUV+hLppQ0tn5NV4QGlQrZTqLcKAzXsSnHDAR0z/g24u57GfBI+Yfbq6aPU8cRgxGpuBHv1TukzSdrV+mzrK44EEN5xyWU60h+6HdqpxGK9N6UQpqRpozhSm97+cGJI19nEl19uXI8fq8cs1hNcFd9PJUHoqgytmBYLftsZkR9fY7Eqi09td3Oz5FAbjoPR8gQrMmB+6u0efPLz063LzLnvWMwm5dWBnIFkOgzf94KADtvH6rqlOYsKkTfn7tXV7hSgMtJ5ZQvGhEeMiM=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILgk15DotLDjYysP0J7npCpv6uprFE2Wsg4hpLokbk5t#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDtKKQ9sFU3b3aI5PU4sMyF0pv86qAPaBKzRI6vCkuPbSyeyy07bf1GoaFgWvJnXzsJQvdtDVi/aVjdG1hB4fS4=#012 create=True mode=0644 path=/tmp/ansible.8wi_rdt4 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:46 np0005544118 python3.9[75134]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.8wi_rdt4' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:59:46 np0005544118 python3.9[75288]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.8wi_rdt4 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 08:59:47 np0005544118 systemd-logind[795]: Session 17 logged out. Waiting for processes to exit.
Dec  3 08:59:47 np0005544118 systemd[1]: session-17.scope: Deactivated successfully.
Dec  3 08:59:47 np0005544118 systemd[1]: session-17.scope: Consumed 3.415s CPU time.
Dec  3 08:59:47 np0005544118 systemd-logind[795]: Removed session 17.
Dec  3 08:59:48 np0005544118 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  3 08:59:53 np0005544118 systemd-logind[795]: New session 18 of user zuul.
Dec  3 08:59:53 np0005544118 systemd[1]: Started Session 18 of User zuul.
Dec  3 08:59:54 np0005544118 python3.9[75468]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 08:59:55 np0005544118 python3.9[75624]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  3 08:59:56 np0005544118 python3.9[75778]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 08:59:57 np0005544118 python3.9[75931]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:59:58 np0005544118 python3.9[76084]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 08:59:58 np0005544118 python3.9[76238]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 08:59:59 np0005544118 python3.9[76393]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:00 np0005544118 systemd[1]: session-18.scope: Deactivated successfully.
Dec  3 09:00:00 np0005544118 systemd[1]: session-18.scope: Consumed 4.746s CPU time.
Dec  3 09:00:00 np0005544118 systemd-logind[795]: Session 18 logged out. Waiting for processes to exit.
Dec  3 09:00:00 np0005544118 systemd-logind[795]: Removed session 18.
Dec  3 09:00:05 np0005544118 systemd-logind[795]: New session 19 of user zuul.
Dec  3 09:00:05 np0005544118 systemd[1]: Started Session 19 of User zuul.
Dec  3 09:00:06 np0005544118 python3.9[76573]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:00:07 np0005544118 python3.9[76729]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 09:00:08 np0005544118 python3.9[76813]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec  3 09:00:11 np0005544118 python3.9[76964]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:00:12 np0005544118 python3.9[77115]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  3 09:00:13 np0005544118 python3.9[77265]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:00:13 np0005544118 python3.9[77415]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:00:14 np0005544118 systemd[1]: session-19.scope: Deactivated successfully.
Dec  3 09:00:14 np0005544118 systemd[1]: session-19.scope: Consumed 6.043s CPU time.
Dec  3 09:00:14 np0005544118 systemd-logind[795]: Session 19 logged out. Waiting for processes to exit.
Dec  3 09:00:14 np0005544118 systemd-logind[795]: Removed session 19.
Dec  3 09:00:19 np0005544118 systemd-logind[795]: New session 20 of user zuul.
Dec  3 09:00:19 np0005544118 systemd[1]: Started Session 20 of User zuul.
Dec  3 09:00:20 np0005544118 python3.9[77593]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:00:22 np0005544118 python3.9[77749]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:23 np0005544118 python3.9[77901]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:23 np0005544118 python3.9[78053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:24 np0005544118 python3.9[78176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770423.2159631-111-109244056884460/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=6f9a8534878c87b4dfbb59875d46cc2bd4239f0a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:25 np0005544118 python3.9[78328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:25 np0005544118 python3.9[78451]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770424.6768584-111-126275044413920/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=50d0c32006a11004c93fff6d565c74a4c424174e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:26 np0005544118 python3.9[78603]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:27 np0005544118 python3.9[78726]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770426.0916188-111-135048554087157/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=e532f76e74f0a00443cf8ed36b1bff160b5c1cae backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:27 np0005544118 python3.9[78878]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:28 np0005544118 python3.9[79030]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:29 np0005544118 python3.9[79182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:29 np0005544118 python3.9[79305]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770428.617479-235-280452620095634/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=01317766708d90a3a68b0fff82f3ea3aed2363af backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:30 np0005544118 python3.9[79457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:31 np0005544118 python3.9[79580]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770429.9494042-235-17866582074829/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=d6e208bd47b017b1e290320a71864afe6dbc3cc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:31 np0005544118 python3.9[79732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:32 np0005544118 python3.9[79855]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770431.1802855-235-276214952446090/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=56a1dcd36df1622c06d6710ecc131cfe0a6a04d7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:32 np0005544118 python3.9[80007]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:33 np0005544118 python3.9[80159]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:34 np0005544118 python3.9[80311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:34 np0005544118 python3.9[80434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770433.69998-358-31684650762272/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=b072bc1c193a7a17dcb94ac34878fe8cdd752bf6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:35 np0005544118 python3.9[80586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:35 np0005544118 python3.9[80709]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770434.892619-358-16930059858390/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b73ff3b99552653b115b459f0bc86abd556e623f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:36 np0005544118 python3.9[80861]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:37 np0005544118 python3.9[80984]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770436.0928566-358-71129235656376/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=785ddfcf5d4955b39842bb69e8df14755d1506e4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:37 np0005544118 python3.9[81136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:38 np0005544118 python3.9[81288]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:39 np0005544118 python3.9[81440]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:39 np0005544118 python3.9[81563]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770438.649937-471-127550611151307/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=95aa8c27ac7303a0d961b9aed86c37cf215d802e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:40 np0005544118 python3.9[81715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:40 np0005544118 python3.9[81838]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770439.7310562-471-225984585278046/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=b73ff3b99552653b115b459f0bc86abd556e623f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:41 np0005544118 python3.9[81990]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:41 np0005544118 python3.9[82113]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770440.883922-471-120230575318502/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=4e34e37007015fb0a8e440565608bb824f104460 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:43 np0005544118 python3.9[82265]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:43 np0005544118 python3.9[82417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:44 np0005544118 python3.9[82540]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770443.463716-607-9082743258206/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:45 np0005544118 python3.9[82692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:45 np0005544118 python3.9[82844]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:46 np0005544118 python3.9[82967]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770445.3509178-659-132844128273686/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:47 np0005544118 python3.9[83119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:47 np0005544118 python3.9[83271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:48 np0005544118 python3.9[83394]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770447.371837-708-60519295985665/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:49 np0005544118 python3.9[83546]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:49 np0005544118 python3.9[83698]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:50 np0005544118 python3.9[83821]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770449.2649395-758-123663801572564/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:51 np0005544118 python3.9[83973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:51 np0005544118 chronyd[65150]: Selected source 23.133.168.246 (pool.ntp.org)
Dec  3 09:00:51 np0005544118 python3.9[84125]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:52 np0005544118 python3.9[84248]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770451.19129-806-59290922739611/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:53 np0005544118 python3.9[84400]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:53 np0005544118 python3.9[84552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:54 np0005544118 python3.9[84675]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770453.2963252-854-192435682411297/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:55 np0005544118 python3.9[84827]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:00:56 np0005544118 python3.9[84979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:00:56 np0005544118 python3.9[85102]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770455.5690703-903-224114308723645/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e4ee54def979bf06b363efc8b2dc6211bb19d0e7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:00:56 np0005544118 systemd[1]: session-20.scope: Deactivated successfully.
Dec  3 09:00:56 np0005544118 systemd[1]: session-20.scope: Consumed 29.170s CPU time.
Dec  3 09:00:56 np0005544118 systemd-logind[795]: Session 20 logged out. Waiting for processes to exit.
Dec  3 09:00:57 np0005544118 systemd-logind[795]: Removed session 20.
Dec  3 09:01:03 np0005544118 systemd-logind[795]: New session 21 of user zuul.
Dec  3 09:01:03 np0005544118 systemd[1]: Started Session 21 of User zuul.
Dec  3 09:01:04 np0005544118 python3.9[85295]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:01:05 np0005544118 python3.9[85451]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:05 np0005544118 python3.9[85603]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:06 np0005544118 python3.9[85753]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:01:07 np0005544118 python3.9[85905]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  3 09:01:09 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec  3 09:01:10 np0005544118 python3.9[86061]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 09:01:10 np0005544118 python3.9[86145]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 09:01:13 np0005544118 python3.9[86298]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 09:01:14 np0005544118 python3[86453]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec  3 09:01:15 np0005544118 python3.9[86605]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:17 np0005544118 python3.9[86757]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:17 np0005544118 python3.9[86835]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:18 np0005544118 python3.9[86987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:18 np0005544118 python3.9[87065]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.wj4abs00 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:19 np0005544118 python3.9[87217]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:20 np0005544118 python3.9[87295]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:20 np0005544118 python3.9[87447]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:01:21 np0005544118 python3[87600]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  3 09:01:22 np0005544118 python3.9[87752]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:22 np0005544118 python3.9[87877]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770481.847358-295-44819767180558/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:23 np0005544118 python3.9[88029]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:24 np0005544118 python3.9[88154]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770483.1543055-325-183848443917461/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:24 np0005544118 python3.9[88306]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:25 np0005544118 python3.9[88431]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770484.3708782-355-7122674861256/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:26 np0005544118 python3.9[88583]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:26 np0005544118 python3.9[88708]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770485.627083-385-103622637835210/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:27 np0005544118 python3.9[88860]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:28 np0005544118 python3.9[88985]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770486.9206746-415-7684846530804/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:28 np0005544118 python3.9[89137]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:29 np0005544118 python3.9[89289]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:01:30 np0005544118 python3.9[89444]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:31 np0005544118 python3.9[89598]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:01:31 np0005544118 python3.9[89751]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:01:32 np0005544118 python3.9[89905]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:01:33 np0005544118 python3.9[90060]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:34 np0005544118 python3.9[90210]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:01:35 np0005544118 python3.9[90363]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:01:35 np0005544118 ovs-vsctl[90364]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec  3 09:01:36 np0005544118 python3.9[90516]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:01:36 np0005544118 python3.9[90671]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:01:36 np0005544118 ovs-vsctl[90672]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec  3 09:01:37 np0005544118 python3.9[90822]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:01:38 np0005544118 python3.9[90976]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:38 np0005544118 python3.9[91128]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:39 np0005544118 python3.9[91206]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:39 np0005544118 python3.9[91358]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:40 np0005544118 python3.9[91436]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:41 np0005544118 python3.9[91588]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:41 np0005544118 python3.9[91740]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:42 np0005544118 python3.9[91818]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:43 np0005544118 python3.9[91970]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:43 np0005544118 python3.9[92048]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:44 np0005544118 python3.9[92200]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:01:44 np0005544118 systemd[1]: Reloading.
Dec  3 09:01:44 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:01:44 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:01:45 np0005544118 python3.9[92389]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:45 np0005544118 python3.9[92467]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:46 np0005544118 python3.9[92619]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:46 np0005544118 python3.9[92697]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:47 np0005544118 python3.9[92849]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:01:47 np0005544118 systemd[1]: Reloading.
Dec  3 09:01:47 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:01:47 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:01:47 np0005544118 systemd[1]: Starting Create netns directory...
Dec  3 09:01:47 np0005544118 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  3 09:01:47 np0005544118 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  3 09:01:47 np0005544118 systemd[1]: Finished Create netns directory.
Dec  3 09:01:48 np0005544118 python3.9[93041]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:49 np0005544118 python3.9[93193]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:49 np0005544118 python3.9[93316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770509.0337095-917-210251341693812/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:50 np0005544118 python3.9[93468]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:01:51 np0005544118 python3.9[93620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:01:52 np0005544118 python3.9[93743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770511.2196057-967-205764753483734/.source.json _original_basename=.pl949s99 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:52 np0005544118 python3.9[93895]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:01:55 np0005544118 python3.9[94322]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec  3 09:01:55 np0005544118 python3.9[94474]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  3 09:01:56 np0005544118 python3.9[94626]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  3 09:01:56 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 09:01:58 np0005544118 python3[94788]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  3 09:01:58 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 09:01:58 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 09:01:58 np0005544118 podman[94825]: 2025-12-03 14:01:58.392786083 +0000 UTC m=+0.051010731 container create 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  3 09:01:58 np0005544118 podman[94825]: 2025-12-03 14:01:58.367132944 +0000 UTC m=+0.025357622 image pull 3a37a52861b2e44ebd2a63ca2589a7c9d8e4119e5feace9d19c6312ed9b8421c quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  3 09:01:58 np0005544118 python3[94788]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec  3 09:01:59 np0005544118 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec  3 09:01:59 np0005544118 python3.9[95014]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:02:00 np0005544118 python3.9[95168]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:00 np0005544118 python3.9[95244]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:02:01 np0005544118 python3.9[95395]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764770520.678221-1143-145712005043626/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:01 np0005544118 python3.9[95471]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:02:01 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:01 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:01 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:02 np0005544118 python3.9[95581]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:02:02 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:02 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:02 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:02 np0005544118 systemd[1]: Starting ovn_controller container...
Dec  3 09:02:03 np0005544118 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec  3 09:02:03 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:02:03 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66f0d086d62665e4d129248cd7ef6f6ba10831d33ec03152fe8d5f343bebc9cc/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  3 09:02:03 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1.
Dec  3 09:02:03 np0005544118 podman[95621]: 2025-12-03 14:02:03.067776159 +0000 UTC m=+0.132411136 container init 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + sudo -E kolla_set_configs
Dec  3 09:02:03 np0005544118 podman[95621]: 2025-12-03 14:02:03.099040143 +0000 UTC m=+0.163675070 container start 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:02:03 np0005544118 systemd[1]: Created slice User Slice of UID 0.
Dec  3 09:02:03 np0005544118 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec  3 09:02:03 np0005544118 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec  3 09:02:03 np0005544118 edpm-start-podman-container[95621]: ovn_controller
Dec  3 09:02:03 np0005544118 systemd[1]: Starting User Manager for UID 0...
Dec  3 09:02:03 np0005544118 edpm-start-podman-container[95620]: Creating additional drop-in dependency for "ovn_controller" (6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1)
Dec  3 09:02:03 np0005544118 podman[95643]: 2025-12-03 14:02:03.210342069 +0000 UTC m=+0.100276941 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Dec  3 09:02:03 np0005544118 systemd[1]: 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1-6e1b20e0338233d1.service: Main process exited, code=exited, status=1/FAILURE
Dec  3 09:02:03 np0005544118 systemd[1]: 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1-6e1b20e0338233d1.service: Failed with result 'exit-code'.
Dec  3 09:02:03 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:03 np0005544118 systemd[95657]: Queued start job for default target Main User Target.
Dec  3 09:02:03 np0005544118 systemd[95657]: Created slice User Application Slice.
Dec  3 09:02:03 np0005544118 systemd[95657]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec  3 09:02:03 np0005544118 systemd[95657]: Started Daily Cleanup of User's Temporary Directories.
Dec  3 09:02:03 np0005544118 systemd[95657]: Reached target Paths.
Dec  3 09:02:03 np0005544118 systemd[95657]: Reached target Timers.
Dec  3 09:02:03 np0005544118 systemd[95657]: Starting D-Bus User Message Bus Socket...
Dec  3 09:02:03 np0005544118 systemd[95657]: Starting Create User's Volatile Files and Directories...
Dec  3 09:02:03 np0005544118 systemd[95657]: Finished Create User's Volatile Files and Directories.
Dec  3 09:02:03 np0005544118 systemd[95657]: Listening on D-Bus User Message Bus Socket.
Dec  3 09:02:03 np0005544118 systemd[95657]: Reached target Sockets.
Dec  3 09:02:03 np0005544118 systemd[95657]: Reached target Basic System.
Dec  3 09:02:03 np0005544118 systemd[95657]: Reached target Main User Target.
Dec  3 09:02:03 np0005544118 systemd[95657]: Startup finished in 112ms.
Dec  3 09:02:03 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:03 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:03 np0005544118 systemd[1]: Started User Manager for UID 0.
Dec  3 09:02:03 np0005544118 systemd[1]: Started ovn_controller container.
Dec  3 09:02:03 np0005544118 systemd[1]: Started Session c1 of User root.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: INFO:__main__:Validating config file
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: INFO:__main__:Writing out command to execute
Dec  3 09:02:03 np0005544118 systemd[1]: session-c1.scope: Deactivated successfully.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: ++ cat /run_command
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + ARGS=
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + sudo kolla_copy_cacerts
Dec  3 09:02:03 np0005544118 systemd[1]: Started Session c2 of User root.
Dec  3 09:02:03 np0005544118 systemd[1]: session-c2.scope: Deactivated successfully.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + [[ ! -n '' ]]
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + . kolla_extend_start
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + umask 0022
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6168] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6177] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6191] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6197] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6200] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec  3 09:02:03 np0005544118 kernel: br-int: entered promiscuous mode
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00024|main|INFO|OVS feature set changed, force recompute.
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  3 09:02:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:03Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6420] manager: (ovn-d96682-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec  3 09:02:03 np0005544118 systemd-udevd[95768]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:02:03 np0005544118 kernel: genev_sys_6081: entered promiscuous mode
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6608] device (genev_sys_6081): carrier: link connected
Dec  3 09:02:03 np0005544118 systemd-udevd[95770]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.6618] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec  3 09:02:03 np0005544118 NetworkManager[55710]: <info>  [1764770523.9194] manager: (ovn-3a9d7e-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec  3 09:02:04 np0005544118 python3.9[95900]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:02:04 np0005544118 ovs-vsctl[95901]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec  3 09:02:05 np0005544118 python3.9[96053]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:02:05 np0005544118 ovs-vsctl[96055]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec  3 09:02:06 np0005544118 python3.9[96208]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:02:06 np0005544118 ovs-vsctl[96209]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec  3 09:02:06 np0005544118 systemd[1]: session-21.scope: Deactivated successfully.
Dec  3 09:02:06 np0005544118 systemd[1]: session-21.scope: Consumed 45.886s CPU time.
Dec  3 09:02:06 np0005544118 systemd-logind[795]: Session 21 logged out. Waiting for processes to exit.
Dec  3 09:02:06 np0005544118 systemd-logind[795]: Removed session 21.
Dec  3 09:02:11 np0005544118 systemd-logind[795]: New session 23 of user zuul.
Dec  3 09:02:11 np0005544118 systemd[1]: Started Session 23 of User zuul.
Dec  3 09:02:12 np0005544118 python3.9[96387]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:02:13 np0005544118 python3.9[96543]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:13 np0005544118 systemd[1]: Stopping User Manager for UID 0...
Dec  3 09:02:13 np0005544118 systemd[95657]: Activating special unit Exit the Session...
Dec  3 09:02:13 np0005544118 systemd[95657]: Stopped target Main User Target.
Dec  3 09:02:13 np0005544118 systemd[95657]: Stopped target Basic System.
Dec  3 09:02:13 np0005544118 systemd[95657]: Stopped target Paths.
Dec  3 09:02:13 np0005544118 systemd[95657]: Stopped target Sockets.
Dec  3 09:02:13 np0005544118 systemd[95657]: Stopped target Timers.
Dec  3 09:02:13 np0005544118 systemd[95657]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  3 09:02:13 np0005544118 systemd[95657]: Closed D-Bus User Message Bus Socket.
Dec  3 09:02:13 np0005544118 systemd[95657]: Stopped Create User's Volatile Files and Directories.
Dec  3 09:02:13 np0005544118 systemd[95657]: Removed slice User Application Slice.
Dec  3 09:02:13 np0005544118 systemd[95657]: Reached target Shutdown.
Dec  3 09:02:13 np0005544118 systemd[95657]: Finished Exit the Session.
Dec  3 09:02:13 np0005544118 systemd[95657]: Reached target Exit the Session.
Dec  3 09:02:13 np0005544118 systemd[1]: user@0.service: Deactivated successfully.
Dec  3 09:02:13 np0005544118 systemd[1]: Stopped User Manager for UID 0.
Dec  3 09:02:13 np0005544118 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec  3 09:02:13 np0005544118 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec  3 09:02:13 np0005544118 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec  3 09:02:13 np0005544118 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec  3 09:02:13 np0005544118 systemd[1]: Removed slice User Slice of UID 0.
Dec  3 09:02:14 np0005544118 python3.9[96697]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:14 np0005544118 python3.9[96849]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:15 np0005544118 python3.9[97001]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:16 np0005544118 python3.9[97153]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:17 np0005544118 python3.9[97303]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:02:17 np0005544118 python3.9[97455]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec  3 09:02:19 np0005544118 python3.9[97605]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:20 np0005544118 python3.9[97726]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770538.9045975-153-20206014740452/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:20 np0005544118 python3.9[97876]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:21 np0005544118 python3.9[97997]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770540.3771229-183-186144317818832/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:22 np0005544118 python3.9[98150]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 09:02:23 np0005544118 python3.9[98234]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 09:02:25 np0005544118 python3.9[98387]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 09:02:26 np0005544118 python3.9[98540]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:27 np0005544118 python3.9[98661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770546.0627453-257-262489861253886/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:27 np0005544118 python3.9[98811]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:28 np0005544118 python3.9[98932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770547.2326157-257-162123893000070/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:29 np0005544118 python3.9[99082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:30 np0005544118 python3.9[99203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770548.9937997-345-204139009446155/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:30 np0005544118 python3.9[99353]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:31 np0005544118 python3.9[99474]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770550.338438-345-202034056428553/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:31 np0005544118 python3.9[99624]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:02:32 np0005544118 python3.9[99778]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:33 np0005544118 python3.9[99930]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:33Z|00025|memory|INFO|16512 kB peak resident set size after 30.0 seconds
Dec  3 09:02:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:02:33Z|00026|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec  3 09:02:33 np0005544118 podman[99980]: 2025-12-03 14:02:33.624189605 +0000 UTC m=+0.111740637 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:02:33 np0005544118 python3.9[100027]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:34 np0005544118 python3.9[100187]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:35 np0005544118 python3.9[100265]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:35 np0005544118 python3.9[100417]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:36 np0005544118 python3.9[100569]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:36 np0005544118 python3.9[100647]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:37 np0005544118 python3.9[100799]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:38 np0005544118 python3.9[100877]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:38 np0005544118 python3.9[101029]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:02:38 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:38 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:38 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:39 np0005544118 python3.9[101217]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:40 np0005544118 python3.9[101295]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:41 np0005544118 python3.9[101447]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:41 np0005544118 python3.9[101525]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:42 np0005544118 python3.9[101677]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:02:42 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:42 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:42 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:42 np0005544118 systemd[1]: Starting Create netns directory...
Dec  3 09:02:42 np0005544118 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  3 09:02:42 np0005544118 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  3 09:02:42 np0005544118 systemd[1]: Finished Create netns directory.
Dec  3 09:02:43 np0005544118 python3.9[101870]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:44 np0005544118 python3.9[102022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:45 np0005544118 python3.9[102145]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770564.007773-647-242570856629858/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:45 np0005544118 python3.9[102297]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:02:46 np0005544118 python3.9[102449]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:02:47 np0005544118 python3.9[102572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770566.187208-697-259071566004567/.source.json _original_basename=.nn31479f follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:47 np0005544118 python3.9[102724]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:50 np0005544118 python3.9[103151]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec  3 09:02:51 np0005544118 python3.9[103303]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  3 09:02:52 np0005544118 python3.9[103455]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  3 09:02:53 np0005544118 python3[103633]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  3 09:02:53 np0005544118 podman[103671]: 2025-12-03 14:02:53.866968849 +0000 UTC m=+0.050946300 container create f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  3 09:02:53 np0005544118 podman[103671]: 2025-12-03 14:02:53.838690907 +0000 UTC m=+0.022668378 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:02:53 np0005544118 python3[103633]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:02:54 np0005544118 python3.9[103861]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:02:55 np0005544118 python3.9[104015]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:56 np0005544118 python3.9[104091]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:02:56 np0005544118 python3.9[104242]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764770576.1433334-873-197828351730341/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:02:57 np0005544118 python3.9[104318]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:02:57 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:57 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:57 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:58 np0005544118 python3.9[104429]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:02:58 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:58 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:58 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:58 np0005544118 systemd[1]: Starting ovn_metadata_agent container...
Dec  3 09:02:58 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:02:58 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b477e7acdb8d283ab8b03d1fec66263503ec6de554b5e16d286b831da53e05e/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec  3 09:02:58 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b477e7acdb8d283ab8b03d1fec66263503ec6de554b5e16d286b831da53e05e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:02:58 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c.
Dec  3 09:02:58 np0005544118 podman[104470]: 2025-12-03 14:02:58.689306451 +0000 UTC m=+0.193115180 container init f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + sudo -E kolla_set_configs
Dec  3 09:02:58 np0005544118 podman[104470]: 2025-12-03 14:02:58.718616338 +0000 UTC m=+0.222425047 container start f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:02:58 np0005544118 edpm-start-podman-container[104470]: ovn_metadata_agent
Dec  3 09:02:58 np0005544118 edpm-start-podman-container[104469]: Creating additional drop-in dependency for "ovn_metadata_agent" (f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c)
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Validating config file
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Copying service configuration files
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Writing out command to execute
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /var/lib/neutron
Dec  3 09:02:58 np0005544118 podman[104493]: 2025-12-03 14:02:58.783832751 +0000 UTC m=+0.054050076 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: ++ cat /run_command
Dec  3 09:02:58 np0005544118 systemd[1]: Reloading.
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + CMD=neutron-ovn-metadata-agent
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + ARGS=
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + sudo kolla_copy_cacerts
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + [[ ! -n '' ]]
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + . kolla_extend_start
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: Running command: 'neutron-ovn-metadata-agent'
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + umask 0022
Dec  3 09:02:58 np0005544118 ovn_metadata_agent[104486]: + exec neutron-ovn-metadata-agent
Dec  3 09:02:58 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:02:58 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:02:59 np0005544118 systemd[1]: Started ovn_metadata_agent container.
Dec  3 09:02:59 np0005544118 systemd[1]: session-23.scope: Deactivated successfully.
Dec  3 09:02:59 np0005544118 systemd[1]: session-23.scope: Consumed 34.839s CPU time.
Dec  3 09:02:59 np0005544118 systemd-logind[795]: Session 23 logged out. Waiting for processes to exit.
Dec  3 09:02:59 np0005544118 systemd-logind[795]: Removed session 23.
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.893 104491 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.894 104491 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.894 104491 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.894 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.894 104491 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.895 104491 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.896 104491 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.897 104491 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.898 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.899 104491 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.900 104491 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.901 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.902 104491 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.903 104491 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.904 104491 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.905 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.906 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.907 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.908 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.909 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.910 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.911 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.912 104491 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.913 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.914 104491 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.915 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.916 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.917 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.918 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.919 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.920 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.921 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.922 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.923 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.924 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.925 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.925 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.925 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.925 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.925 104491 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.925 104491 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.938 104491 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.938 104491 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.939 104491 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.939 104491 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.940 104491 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.956 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name ac9297d1-94e5-43bb-91f9-3d345a639adf (UUID: ac9297d1-94e5-43bb-91f9-3d345a639adf) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.986 104491 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.986 104491 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.986 104491 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.986 104491 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.991 104491 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  3 09:03:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:00.997 104491 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.004 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'ac9297d1-94e5-43bb-91f9-3d345a639adf'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], external_ids={}, name=ac9297d1-94e5-43bb-91f9-3d345a639adf, nb_cfg_timestamp=1764770531647, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.005 104491 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fe6f4b94bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.006 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.006 104491 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.006 104491 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.007 104491 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.013 104491 DEBUG oslo_service.service [-] Started child 104600 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.016 104600 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-364691'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.018 104491 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp4fjlkfw8/privsep.sock']#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.044 104600 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.045 104600 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.045 104600 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.049 104600 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.055 104600 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.063 104600 INFO eventlet.wsgi.server [-] (104600) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec  3 09:03:01 np0005544118 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.748 104491 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.749 104491 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp4fjlkfw8/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.591 104605 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.596 104605 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.598 104605 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.598 104605 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104605#033[00m
Dec  3 09:03:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:01.753 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[6706cee2-9550-40e5-8e15-a31ce272c501]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:03:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:02.315 104605 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:03:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:02.315 104605 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:03:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:02.315 104605 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:03:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:02.915 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c4846b-0d0b-4776-9e04-7fb60f57b941]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:03:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:02.918 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, column=external_ids, values=({'neutron:ovn-metadata-id': 'fa7605c4-93d0-513f-8844-3f1c20bd164e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.255 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.266 104491 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.266 104491 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.266 104491 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.266 104491 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.266 104491 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.267 104491 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.267 104491 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.267 104491 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.267 104491 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.267 104491 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.267 104491 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.267 104491 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.268 104491 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.269 104491 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.269 104491 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.269 104491 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.269 104491 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.269 104491 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.269 104491 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.270 104491 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.271 104491 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.272 104491 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.272 104491 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.272 104491 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.272 104491 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.272 104491 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.272 104491 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.272 104491 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.273 104491 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.274 104491 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.275 104491 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.276 104491 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.277 104491 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.278 104491 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.279 104491 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.280 104491 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.280 104491 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.280 104491 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.280 104491 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.280 104491 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.280 104491 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.280 104491 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.281 104491 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.281 104491 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.281 104491 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.281 104491 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.281 104491 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.281 104491 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.282 104491 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.282 104491 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.282 104491 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.282 104491 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.282 104491 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.282 104491 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.282 104491 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.283 104491 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.283 104491 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.283 104491 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.283 104491 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.283 104491 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.283 104491 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.283 104491 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.284 104491 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.285 104491 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.286 104491 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.287 104491 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.288 104491 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.289 104491 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.289 104491 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.289 104491 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.289 104491 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.289 104491 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.289 104491 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.289 104491 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.290 104491 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.291 104491 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.292 104491 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.293 104491 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.294 104491 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.295 104491 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.295 104491 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.295 104491 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.295 104491 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.295 104491 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.295 104491 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.295 104491 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.296 104491 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.297 104491 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.298 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.299 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.300 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.301 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.302 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.302 104491 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.302 104491 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.302 104491 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.302 104491 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.302 104491 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:03:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:03:03.302 104491 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  3 09:03:03 np0005544118 podman[104610]: 2025-12-03 14:03:03.854647272 +0000 UTC m=+0.082548762 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller)
Dec  3 09:03:05 np0005544118 systemd-logind[795]: New session 24 of user zuul.
Dec  3 09:03:05 np0005544118 systemd[1]: Started Session 24 of User zuul.
Dec  3 09:03:06 np0005544118 python3.9[104791]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:03:08 np0005544118 python3.9[104947]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:09 np0005544118 python3.9[105111]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:03:09 np0005544118 systemd[1]: Reloading.
Dec  3 09:03:10 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:03:10 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:03:11 np0005544118 python3.9[105295]: ansible-ansible.builtin.service_facts Invoked
Dec  3 09:03:11 np0005544118 network[105312]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  3 09:03:11 np0005544118 network[105313]: 'network-scripts' will be removed from distribution in near future.
Dec  3 09:03:11 np0005544118 network[105314]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  3 09:03:16 np0005544118 python3.9[105575]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:03:16 np0005544118 python3.9[105728]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:03:17 np0005544118 python3.9[105881]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:03:18 np0005544118 python3.9[106034]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:03:19 np0005544118 python3.9[106187]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:03:19 np0005544118 python3.9[106340]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:03:20 np0005544118 python3.9[106493]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:03:21 np0005544118 python3.9[106646]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:22 np0005544118 python3.9[106798]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:23 np0005544118 python3.9[106950]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:23 np0005544118 python3.9[107102]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:24 np0005544118 python3.9[107254]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:24 np0005544118 python3.9[107406]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:25 np0005544118 python3.9[107558]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:26 np0005544118 python3.9[107710]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:27 np0005544118 python3.9[107862]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:28 np0005544118 python3.9[108014]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:28 np0005544118 python3.9[108166]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:29 np0005544118 podman[108290]: 2025-12-03 14:03:29.074740449 +0000 UTC m=+0.064868403 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  3 09:03:29 np0005544118 python3.9[108337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:29 np0005544118 python3.9[108489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:30 np0005544118 python3.9[108641]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:03:31 np0005544118 python3.9[108793]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:32 np0005544118 python3.9[108945]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  3 09:03:33 np0005544118 python3.9[109097]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:03:33 np0005544118 systemd[1]: Reloading.
Dec  3 09:03:33 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:03:33 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:03:34 np0005544118 podman[109256]: 2025-12-03 14:03:34.136290497 +0000 UTC m=+0.114059295 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  3 09:03:34 np0005544118 python3.9[109303]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:35 np0005544118 python3.9[109464]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:35 np0005544118 python3.9[109617]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:36 np0005544118 python3.9[109770]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:37 np0005544118 python3.9[109923]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:37 np0005544118 python3.9[110076]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:38 np0005544118 python3.9[110229]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:03:39 np0005544118 python3.9[110382]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec  3 09:03:40 np0005544118 python3.9[110535]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  3 09:03:41 np0005544118 python3.9[110693]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  3 09:03:42 np0005544118 python3.9[110853]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 09:03:43 np0005544118 python3.9[110937]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 09:03:59 np0005544118 podman[111122]: 2025-12-03 14:03:59.843733395 +0000 UTC m=+0.055338057 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:04:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:04:00.928 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:04:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:04:00.931 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:04:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:04:00.931 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:04:04 np0005544118 podman[111143]: 2025-12-03 14:04:04.891509139 +0000 UTC m=+0.123150722 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:04:19 np0005544118 kernel: SELinux:  Converting 2758 SID table entries...
Dec  3 09:04:19 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 09:04:19 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 09:04:19 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 09:04:19 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 09:04:19 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 09:04:19 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 09:04:19 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 09:04:30 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Dec  3 09:04:30 np0005544118 podman[111184]: 2025-12-03 14:04:30.838385017 +0000 UTC m=+0.055525920 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec  3 09:04:31 np0005544118 kernel: SELinux:  Converting 2758 SID table entries...
Dec  3 09:04:31 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 09:04:31 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 09:04:31 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 09:04:31 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 09:04:31 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 09:04:31 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 09:04:31 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 09:04:35 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec  3 09:04:35 np0005544118 podman[111210]: 2025-12-03 14:04:35.879761771 +0000 UTC m=+0.099580747 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  3 09:05:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:05:00.930 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:05:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:05:00.931 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:05:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:05:00.931 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:05:01 np0005544118 podman[121301]: 2025-12-03 14:05:01.836751542 +0000 UTC m=+0.057841251 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  3 09:05:06 np0005544118 podman[124567]: 2025-12-03 14:05:06.885386533 +0000 UTC m=+0.116590496 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  3 09:05:27 np0005544118 kernel: SELinux:  Converting 2759 SID table entries...
Dec  3 09:05:27 np0005544118 kernel: SELinux:  policy capability network_peer_controls=1
Dec  3 09:05:27 np0005544118 kernel: SELinux:  policy capability open_perms=1
Dec  3 09:05:27 np0005544118 kernel: SELinux:  policy capability extended_socket_class=1
Dec  3 09:05:27 np0005544118 kernel: SELinux:  policy capability always_check_network=0
Dec  3 09:05:27 np0005544118 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec  3 09:05:27 np0005544118 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec  3 09:05:27 np0005544118 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec  3 09:05:29 np0005544118 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Dec  3 09:05:29 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec  3 09:05:29 np0005544118 dbus-broker-launch[771]: Noticed file-system modification, trigger reload.
Dec  3 09:05:31 np0005544118 podman[128152]: 2025-12-03 14:05:31.982342422 +0000 UTC m=+0.059299043 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:05:37 np0005544118 podman[128358]: 2025-12-03 14:05:37.881108753 +0000 UTC m=+0.113217678 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  3 09:05:38 np0005544118 systemd[1]: Stopping OpenSSH server daemon...
Dec  3 09:05:38 np0005544118 systemd[1]: sshd.service: Deactivated successfully.
Dec  3 09:05:38 np0005544118 systemd[1]: Stopped OpenSSH server daemon.
Dec  3 09:05:38 np0005544118 systemd[1]: sshd.service: Consumed 2.420s CPU time, read 32.0K from disk, written 52.0K to disk.
Dec  3 09:05:38 np0005544118 systemd[1]: Stopped target sshd-keygen.target.
Dec  3 09:05:38 np0005544118 systemd[1]: Stopping sshd-keygen.target...
Dec  3 09:05:38 np0005544118 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  3 09:05:38 np0005544118 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  3 09:05:38 np0005544118 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec  3 09:05:38 np0005544118 systemd[1]: Reached target sshd-keygen.target.
Dec  3 09:05:38 np0005544118 systemd[1]: Starting OpenSSH server daemon...
Dec  3 09:05:38 np0005544118 systemd[1]: Started OpenSSH server daemon.
Dec  3 09:05:40 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 09:05:40 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 09:05:40 np0005544118 systemd[1]: Reloading.
Dec  3 09:05:40 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:05:40 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:05:41 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 09:05:49 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 09:05:49 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 09:05:49 np0005544118 systemd[1]: man-db-cache-update.service: Consumed 10.472s CPU time.
Dec  3 09:05:49 np0005544118 systemd[1]: run-rcb3b15d3916a460c872e9994676c10fb.service: Deactivated successfully.
Dec  3 09:06:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:06:00.931 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:06:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:06:00.934 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:06:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:06:00.934 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:06:02 np0005544118 podman[137533]: 2025-12-03 14:06:02.83832939 +0000 UTC m=+0.067823315 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 
Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  3 09:06:08 np0005544118 podman[137620]: 2025-12-03 14:06:08.86474923 +0000 UTC m=+0.086389483 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:06:09 np0005544118 python3.9[137708]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 09:06:09 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:09 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:09 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:10 np0005544118 python3.9[137898]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 09:06:10 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:10 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:10 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:11 np0005544118 python3.9[138089]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 09:06:11 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:11 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:11 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:12 np0005544118 python3.9[138279]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 09:06:12 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:12 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:12 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:13 np0005544118 python3.9[138468]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:13 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:13 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:13 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:14 np0005544118 python3.9[138657]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:14 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:14 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:14 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:15 np0005544118 python3.9[138847]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:15 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:15 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:15 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:16 np0005544118 python3.9[139037]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:17 np0005544118 python3.9[139192]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:17 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:17 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:17 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:19 np0005544118 python3.9[139383]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec  3 09:06:19 np0005544118 systemd[1]: Reloading.
Dec  3 09:06:19 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:06:19 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:06:19 np0005544118 systemd[1]: Listening on libvirt proxy daemon socket.
Dec  3 09:06:19 np0005544118 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec  3 09:06:20 np0005544118 python3.9[139578]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:21 np0005544118 python3.9[139733]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:22 np0005544118 python3.9[139888]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:23 np0005544118 python3.9[140043]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:23 np0005544118 python3.9[140198]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:24 np0005544118 python3.9[140353]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:25 np0005544118 python3.9[140508]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:26 np0005544118 python3.9[140663]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:27 np0005544118 python3.9[140818]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:28 np0005544118 python3.9[140973]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:28 np0005544118 python3.9[141128]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:29 np0005544118 python3.9[141283]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:30 np0005544118 python3.9[141438]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:31 np0005544118 python3.9[141593]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec  3 09:06:33 np0005544118 podman[141621]: 2025-12-03 14:06:33.82905384 +0000 UTC m=+0.061108285 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:06:36 np0005544118 python3.9[141768]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:06:36 np0005544118 python3.9[141920]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:06:37 np0005544118 python3.9[142074]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:06:38 np0005544118 python3.9[142226]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:06:38 np0005544118 python3.9[142378]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:06:39 np0005544118 podman[142502]: 2025-12-03 14:06:39.366094665 +0000 UTC m=+0.126085764 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  3 09:06:39 np0005544118 python3.9[142543]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:06:40 np0005544118 python3.9[142708]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:41 np0005544118 python3.9[142833]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770800.0499907-1089-31957997399511/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:42 np0005544118 python3.9[142985]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:42 np0005544118 python3.9[143110]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770801.5971534-1089-80710717021058/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:43 np0005544118 python3.9[143262]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:44 np0005544118 python3.9[143387]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770802.922769-1089-108084735762296/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:45 np0005544118 python3.9[143539]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:45 np0005544118 python3.9[143664]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770804.6259873-1089-185104521452265/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:46 np0005544118 python3.9[143816]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:47 np0005544118 python3.9[143941]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770805.864462-1089-89025873972341/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:47 np0005544118 python3.9[144093]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:48 np0005544118 python3.9[144218]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770807.235663-1089-81464544922873/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:48 np0005544118 python3.9[144370]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:49 np0005544118 python3.9[144493]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770808.4320307-1089-254912592440244/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:50 np0005544118 python3.9[144645]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:06:50 np0005544118 python3.9[144770]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764770809.6493776-1089-90077854997760/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:51 np0005544118 python3.9[144922]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec  3 09:06:52 np0005544118 python3.9[145075]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:53 np0005544118 python3.9[145227]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:53 np0005544118 python3.9[145379]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:54 np0005544118 python3.9[145531]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:55 np0005544118 python3.9[145683]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:55 np0005544118 python3.9[145835]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:56 np0005544118 python3.9[145987]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:57 np0005544118 python3.9[146139]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:57 np0005544118 python3.9[146291]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:58 np0005544118 python3.9[146443]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:59 np0005544118 python3.9[146595]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:06:59 np0005544118 python3.9[146747]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:00 np0005544118 python3.9[146899]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:00 np0005544118 python3.9[147051]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:07:00.934 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:07:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:07:00.935 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:07:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:07:00.936 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:07:02 np0005544118 python3.9[147203]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:02 np0005544118 python3.9[147326]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770821.681119-1531-192632724675285/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:03 np0005544118 python3.9[147478]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:03 np0005544118 podman[147601]: 2025-12-03 14:07:03.991644907 +0000 UTC m=+0.053897883 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  3 09:07:04 np0005544118 python3.9[147602]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770822.9714463-1531-90214197154859/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:04 np0005544118 python3.9[147770]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:05 np0005544118 python3.9[147893]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770824.2486582-1531-152489862234838/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:05 np0005544118 python3.9[148045]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:06 np0005544118 python3.9[148168]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770825.3963273-1531-134407879191363/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:07 np0005544118 python3.9[148320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:07 np0005544118 python3.9[148443]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770826.5365686-1531-244377213385646/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:08 np0005544118 python3.9[148595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:08 np0005544118 python3.9[148718]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770827.7944207-1531-183333057271216/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:09 np0005544118 python3.9[148870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:09 np0005544118 podman[148922]: 2025-12-03 14:07:09.870431779 +0000 UTC m=+0.101135143 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:07:10 np0005544118 python3.9[149017]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770829.0154216-1531-42200654339035/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:10 np0005544118 python3.9[149169]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:11 np0005544118 python3.9[149292]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770830.2762244-1531-19583064325744/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:11 np0005544118 python3.9[149444]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:12 np0005544118 python3.9[149567]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770831.4668148-1531-8132104308730/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:13 np0005544118 python3.9[149719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:13 np0005544118 python3.9[149843]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770832.6377811-1531-113377535796122/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:14 np0005544118 python3.9[149996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:14 np0005544118 python3.9[150119]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770833.8686593-1531-55404953861884/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:15 np0005544118 python3.9[150273]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:15 np0005544118 python3.9[150396]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770835.0518577-1531-35084232884898/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:16 np0005544118 python3.9[150550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:17 np0005544118 python3.9[150673]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770836.15501-1531-194687838870100/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:17 np0005544118 python3.9[150827]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:18 np0005544118 python3.9[150950]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770837.3821812-1531-253635486666327/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:19 np0005544118 python3.9[151102]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:07:20 np0005544118 python3.9[151257]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec  3 09:07:22 np0005544118 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec  3 09:07:22 np0005544118 python3.9[151413]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:22 np0005544118 python3.9[151565]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:23 np0005544118 python3.9[151717]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:24 np0005544118 python3.9[151869]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:24 np0005544118 python3.9[152021]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:25 np0005544118 python3.9[152173]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:26 np0005544118 python3.9[152325]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:26 np0005544118 python3.9[152477]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:27 np0005544118 python3.9[152629]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:28 np0005544118 python3.9[152781]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:29 np0005544118 python3.9[152933]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:07:29 np0005544118 systemd[1]: Reloading.
Dec  3 09:07:29 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:07:29 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:07:29 np0005544118 systemd[1]: Starting libvirt logging daemon socket...
Dec  3 09:07:29 np0005544118 systemd[1]: Listening on libvirt logging daemon socket.
Dec  3 09:07:29 np0005544118 systemd[1]: Starting libvirt logging daemon admin socket...
Dec  3 09:07:29 np0005544118 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec  3 09:07:29 np0005544118 systemd[1]: Starting libvirt logging daemon...
Dec  3 09:07:29 np0005544118 systemd[1]: Started libvirt logging daemon.
Dec  3 09:07:30 np0005544118 python3.9[153126]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:07:30 np0005544118 systemd[1]: Reloading.
Dec  3 09:07:30 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:07:30 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:07:30 np0005544118 systemd[1]: Starting libvirt nodedev daemon socket...
Dec  3 09:07:30 np0005544118 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec  3 09:07:30 np0005544118 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec  3 09:07:30 np0005544118 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec  3 09:07:30 np0005544118 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec  3 09:07:30 np0005544118 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec  3 09:07:30 np0005544118 systemd[1]: Starting libvirt nodedev daemon...
Dec  3 09:07:30 np0005544118 systemd[1]: Started libvirt nodedev daemon.
Dec  3 09:07:31 np0005544118 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec  3 09:07:31 np0005544118 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec  3 09:07:31 np0005544118 python3.9[153343]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:07:31 np0005544118 systemd[1]: Reloading.
Dec  3 09:07:31 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:07:31 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:07:31 np0005544118 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec  3 09:07:31 np0005544118 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec  3 09:07:31 np0005544118 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec  3 09:07:31 np0005544118 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec  3 09:07:31 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:07:31 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:07:31 np0005544118 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec  3 09:07:31 np0005544118 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec  3 09:07:32 np0005544118 python3.9[153562]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:07:32 np0005544118 systemd[1]: Reloading.
Dec  3 09:07:32 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:07:32 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:07:32 np0005544118 setroubleshoot[153342]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a60abc19-1329-443d-8341-9e932ab32fcd
Dec  3 09:07:32 np0005544118 setroubleshoot[153342]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  3 09:07:32 np0005544118 setroubleshoot[153342]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l a60abc19-1329-443d-8341-9e932ab32fcd
Dec  3 09:07:32 np0005544118 setroubleshoot[153342]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec  3 09:07:33 np0005544118 systemd[1]: Listening on libvirt locking daemon socket.
Dec  3 09:07:33 np0005544118 systemd[1]: Starting libvirt QEMU daemon socket...
Dec  3 09:07:33 np0005544118 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec  3 09:07:33 np0005544118 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec  3 09:07:33 np0005544118 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec  3 09:07:33 np0005544118 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec  3 09:07:33 np0005544118 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec  3 09:07:33 np0005544118 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec  3 09:07:33 np0005544118 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec  3 09:07:33 np0005544118 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec  3 09:07:33 np0005544118 systemd[1]: Starting libvirt QEMU daemon...
Dec  3 09:07:33 np0005544118 systemd[1]: Started libvirt QEMU daemon.
Dec  3 09:07:35 np0005544118 podman[153749]: 2025-12-03 14:07:35.346648372 +0000 UTC m=+1.060372041 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:07:35 np0005544118 python3.9[153795]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:07:35 np0005544118 systemd[1]: Reloading.
Dec  3 09:07:35 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:07:35 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:07:36 np0005544118 systemd[1]: Starting libvirt secret daemon socket...
Dec  3 09:07:37 np0005544118 systemd[1]: Listening on libvirt secret daemon socket.
Dec  3 09:07:37 np0005544118 systemd[1]: Starting libvirt secret daemon admin socket...
Dec  3 09:07:37 np0005544118 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec  3 09:07:37 np0005544118 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec  3 09:07:37 np0005544118 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec  3 09:07:37 np0005544118 systemd[1]: Starting libvirt secret daemon...
Dec  3 09:07:37 np0005544118 systemd[1]: Started libvirt secret daemon.
Dec  3 09:07:37 np0005544118 python3.9[154008]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:39 np0005544118 python3.9[154160]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  3 09:07:40 np0005544118 podman[154284]: 2025-12-03 14:07:40.117466527 +0000 UTC m=+0.115770513 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  3 09:07:40 np0005544118 python3.9[154329]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:40 np0005544118 python3.9[154459]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770859.7107275-2221-11269521688039/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:41 np0005544118 python3.9[154611]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:42 np0005544118 python3.9[154763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:42 np0005544118 python3.9[154841]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:42 np0005544118 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec  3 09:07:42 np0005544118 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec  3 09:07:43 np0005544118 python3.9[154993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:43 np0005544118 python3.9[155071]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.8nk4txh0 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:44 np0005544118 python3.9[155223]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:45 np0005544118 python3.9[155301]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:45 np0005544118 python3.9[155454]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:07:46 np0005544118 python3[155607]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  3 09:07:47 np0005544118 python3.9[155759]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:47 np0005544118 python3.9[155837]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:48 np0005544118 python3.9[155989]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:49 np0005544118 python3.9[156067]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:49 np0005544118 python3.9[156219]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:50 np0005544118 python3.9[156297]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:50 np0005544118 python3.9[156449]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:51 np0005544118 python3.9[156527]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:52 np0005544118 python3.9[156679]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:52 np0005544118 python3.9[156804]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764770871.786852-2471-107823415742307/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:53 np0005544118 python3.9[156956]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:54 np0005544118 python3.9[157108]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:07:55 np0005544118 python3.9[157263]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:55 np0005544118 python3.9[157415]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:07:56 np0005544118 python3.9[157568]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:07:57 np0005544118 python3.9[157722]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:07:57 np0005544118 python3.9[157877]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:07:58 np0005544118 python3.9[158029]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:07:59 np0005544118 python3.9[158152]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770878.159015-2615-53792805666360/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:00 np0005544118 python3.9[158304]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:08:00 np0005544118 python3.9[158427]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770879.6941226-2646-142906528969702/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:08:00.934 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:08:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:08:00.935 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:08:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:08:00.935 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:08:01 np0005544118 python3.9[158579]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:08:02 np0005544118 python3.9[158702]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770880.9685664-2675-135901792093436/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:02 np0005544118 python3.9[158854]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:08:02 np0005544118 systemd[1]: Reloading.
Dec  3 09:08:02 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:08:02 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:08:03 np0005544118 systemd[1]: Reached target edpm_libvirt.target.
Dec  3 09:08:04 np0005544118 python3.9[159045]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec  3 09:08:04 np0005544118 systemd[1]: Reloading.
Dec  3 09:08:04 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:08:04 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:08:04 np0005544118 systemd[1]: Reloading.
Dec  3 09:08:04 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:08:04 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:08:05 np0005544118 systemd[1]: session-24.scope: Deactivated successfully.
Dec  3 09:08:05 np0005544118 systemd[1]: session-24.scope: Consumed 3min 31.547s CPU time.
Dec  3 09:08:05 np0005544118 systemd-logind[795]: Session 24 logged out. Waiting for processes to exit.
Dec  3 09:08:05 np0005544118 systemd-logind[795]: Removed session 24.
Dec  3 09:08:05 np0005544118 podman[159144]: 2025-12-03 14:08:05.680499694 +0000 UTC m=+0.081182199 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:08:10 np0005544118 podman[159167]: 2025-12-03 14:08:10.871634612 +0000 UTC m=+0.087266981 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:08:11 np0005544118 systemd-logind[795]: New session 25 of user zuul.
Dec  3 09:08:11 np0005544118 systemd[1]: Started Session 25 of User zuul.
Dec  3 09:08:12 np0005544118 python3.9[159346]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:08:13 np0005544118 python3.9[159500]: ansible-ansible.builtin.service_facts Invoked
Dec  3 09:08:13 np0005544118 network[159517]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  3 09:08:13 np0005544118 network[159518]: 'network-scripts' will be removed from distribution in near future.
Dec  3 09:08:13 np0005544118 network[159519]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  3 09:08:18 np0005544118 python3.9[159790]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec  3 09:08:19 np0005544118 python3.9[159874]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 09:08:25 np0005544118 python3.9[160027]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:08:26 np0005544118 python3.9[160179]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:08:26 np0005544118 python3.9[160332]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:08:27 np0005544118 python3.9[160484]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:08:28 np0005544118 python3.9[160637]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:08:29 np0005544118 python3.9[160760]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770908.017499-171-110445684336616/.source.iscsi _original_basename=.v1jscrrw follow=False checksum=0bf92c89c23f6b10e367759649bc5bb5e3c20eb8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:30 np0005544118 python3.9[160912]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:30 np0005544118 python3.9[161064]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:30 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 09:08:30 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 09:08:32 np0005544118 python3.9[161217]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:08:32 np0005544118 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec  3 09:08:34 np0005544118 python3.9[161373]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:08:34 np0005544118 systemd[1]: Reloading.
Dec  3 09:08:34 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:08:34 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:08:34 np0005544118 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  3 09:08:34 np0005544118 systemd[1]: Starting Open-iSCSI...
Dec  3 09:08:34 np0005544118 kernel: Loading iSCSI transport class v2.0-870.
Dec  3 09:08:34 np0005544118 systemd[1]: Started Open-iSCSI.
Dec  3 09:08:34 np0005544118 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec  3 09:08:34 np0005544118 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec  3 09:08:35 np0005544118 python3.9[161574]: ansible-ansible.builtin.service_facts Invoked
Dec  3 09:08:35 np0005544118 network[161591]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  3 09:08:35 np0005544118 network[161592]: 'network-scripts' will be removed from distribution in near future.
Dec  3 09:08:35 np0005544118 network[161593]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  3 09:08:36 np0005544118 podman[161599]: 2025-12-03 14:08:36.279274826 +0000 UTC m=+0.073196191 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  3 09:08:40 np0005544118 python3.9[161885]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  3 09:08:41 np0005544118 podman[162009]: 2025-12-03 14:08:41.56251934 +0000 UTC m=+0.086393329 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:08:41 np0005544118 python3.9[162057]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec  3 09:08:42 np0005544118 python3.9[162220]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:08:42 np0005544118 python3.9[162343]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770921.999137-325-79118051179410/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:43 np0005544118 python3.9[162495]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:44 np0005544118 python3.9[162647]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:08:44 np0005544118 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  3 09:08:44 np0005544118 systemd[1]: Stopped Load Kernel Modules.
Dec  3 09:08:44 np0005544118 systemd[1]: Stopping Load Kernel Modules...
Dec  3 09:08:44 np0005544118 systemd[1]: Starting Load Kernel Modules...
Dec  3 09:08:44 np0005544118 systemd[1]: Finished Load Kernel Modules.
Dec  3 09:08:45 np0005544118 python3.9[162803]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:08:46 np0005544118 python3.9[162955]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:08:47 np0005544118 python3.9[163107]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:08:47 np0005544118 python3.9[163259]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:08:48 np0005544118 python3.9[163382]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770927.359169-441-116275475044415/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:49 np0005544118 python3.9[163534]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:08:49 np0005544118 python3.9[163687]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:50 np0005544118 python3.9[163839]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:51 np0005544118 python3.9[163991]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:52 np0005544118 python3.9[164143]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:52 np0005544118 python3.9[164295]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:53 np0005544118 python3.9[164447]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:53 np0005544118 python3.9[164599]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:54 np0005544118 python3.9[164751]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:08:55 np0005544118 python3.9[164905]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:08:56 np0005544118 python3.9[165057]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:08:57 np0005544118 python3.9[165209]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:08:57 np0005544118 python3.9[165287]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:08:58 np0005544118 python3.9[165439]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:08:58 np0005544118 python3.9[165517]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:08:59 np0005544118 python3.9[165669]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:00 np0005544118 python3.9[165821]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:09:00 np0005544118 python3.9[165899]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:09:00.934 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:09:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:09:00.937 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:09:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:09:00.937 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:09:01 np0005544118 python3.9[166051]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:09:01 np0005544118 python3.9[166129]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:02 np0005544118 python3.9[166281]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:02 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:02 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:02 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:03 np0005544118 python3.9[166470]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:09:04 np0005544118 python3.9[166548]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:05 np0005544118 python3.9[166700]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:09:05 np0005544118 python3.9[166778]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:06 np0005544118 podman[166930]: 2025-12-03 14:09:06.420201307 +0000 UTC m=+0.081427138 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  3 09:09:06 np0005544118 python3.9[166931]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:06 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:06 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:06 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:06 np0005544118 systemd[1]: Starting Create netns directory...
Dec  3 09:09:06 np0005544118 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec  3 09:09:06 np0005544118 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec  3 09:09:06 np0005544118 systemd[1]: Finished Create netns directory.
Dec  3 09:09:07 np0005544118 python3.9[167143]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:09:08 np0005544118 python3.9[167295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:09:09 np0005544118 python3.9[167418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764770948.1948955-855-164098362573823/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:09:10 np0005544118 python3.9[167570]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:09:11 np0005544118 python3.9[167722]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:09:11 np0005544118 python3.9[167845]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770950.7368495-905-195471130189831/.source.json _original_basename=.xv9pgx_i follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:11 np0005544118 podman[167870]: 2025-12-03 14:09:11.870168987 +0000 UTC m=+0.099250058 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:09:12 np0005544118 python3.9[168024]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:14 np0005544118 python3.9[168453]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec  3 09:09:15 np0005544118 python3.9[168605]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  3 09:09:16 np0005544118 python3.9[168757]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec  3 09:09:18 np0005544118 python3[168935]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec  3 09:09:18 np0005544118 podman[168974]: 2025-12-03 14:09:18.627479708 +0000 UTC m=+0.034236405 image pull 9af6aa52ee187025bc25565b66d3eefb486acac26f9281e33f4cce76a40d21f7 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  3 09:09:18 np0005544118 podman[168974]: 2025-12-03 14:09:18.856457246 +0000 UTC m=+0.263213963 container create 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:09:18 np0005544118 python3[168935]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec  3 09:09:19 np0005544118 python3.9[169164]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:09:20 np0005544118 python3.9[169318]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:20 np0005544118 python3.9[169394]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:09:21 np0005544118 python3.9[169545]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764770960.9557793-1081-195168359896881/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:22 np0005544118 python3.9[169621]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:09:22 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:22 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:22 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:23 np0005544118 python3.9[169732]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:23 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:23 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:23 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:23 np0005544118 systemd[1]: Starting multipathd container...
Dec  3 09:09:23 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:09:23 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97c6cb6ce483f7df9443217cd7b761e1d024874594bd865997cd350605b18254/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  3 09:09:23 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97c6cb6ce483f7df9443217cd7b761e1d024874594bd865997cd350605b18254/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  3 09:09:23 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.
Dec  3 09:09:23 np0005544118 podman[169773]: 2025-12-03 14:09:23.682206405 +0000 UTC m=+0.249027699 container init 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:09:23 np0005544118 multipathd[169789]: + sudo -E kolla_set_configs
Dec  3 09:09:23 np0005544118 podman[169773]: 2025-12-03 14:09:23.722544998 +0000 UTC m=+0.289366272 container start 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  3 09:09:23 np0005544118 podman[169773]: multipathd
Dec  3 09:09:23 np0005544118 systemd[1]: Started multipathd container.
Dec  3 09:09:23 np0005544118 multipathd[169789]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  3 09:09:23 np0005544118 multipathd[169789]: INFO:__main__:Validating config file
Dec  3 09:09:23 np0005544118 multipathd[169789]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  3 09:09:23 np0005544118 multipathd[169789]: INFO:__main__:Writing out command to execute
Dec  3 09:09:23 np0005544118 multipathd[169789]: ++ cat /run_command
Dec  3 09:09:23 np0005544118 multipathd[169789]: + CMD='/usr/sbin/multipathd -d'
Dec  3 09:09:23 np0005544118 multipathd[169789]: + ARGS=
Dec  3 09:09:23 np0005544118 multipathd[169789]: + sudo kolla_copy_cacerts
Dec  3 09:09:23 np0005544118 multipathd[169789]: + [[ ! -n '' ]]
Dec  3 09:09:23 np0005544118 multipathd[169789]: + . kolla_extend_start
Dec  3 09:09:23 np0005544118 multipathd[169789]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  3 09:09:23 np0005544118 multipathd[169789]: Running command: '/usr/sbin/multipathd -d'
Dec  3 09:09:23 np0005544118 multipathd[169789]: + umask 0022
Dec  3 09:09:23 np0005544118 multipathd[169789]: + exec /usr/sbin/multipathd -d
Dec  3 09:09:23 np0005544118 multipathd[169789]: 3119.476110 | --------start up--------
Dec  3 09:09:23 np0005544118 multipathd[169789]: 3119.476132 | read /etc/multipath.conf
Dec  3 09:09:23 np0005544118 multipathd[169789]: 3119.482670 | path checkers start up
Dec  3 09:09:23 np0005544118 podman[169796]: 2025-12-03 14:09:23.824120587 +0000 UTC m=+0.088045123 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec  3 09:09:23 np0005544118 systemd[1]: 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115-7e80b6f7145027c4.service: Main process exited, code=exited, status=1/FAILURE
Dec  3 09:09:23 np0005544118 systemd[1]: 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115-7e80b6f7145027c4.service: Failed with result 'exit-code'.
Dec  3 09:09:24 np0005544118 python3.9[169980]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:09:25 np0005544118 python3.9[170135]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:09:26 np0005544118 python3.9[170300]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:09:26 np0005544118 systemd[1]: Stopping multipathd container...
Dec  3 09:09:26 np0005544118 multipathd[169789]: 3121.954338 | exit (signal)
Dec  3 09:09:26 np0005544118 multipathd[169789]: 3121.955362 | --------shut down-------
Dec  3 09:09:26 np0005544118 systemd[1]: libpod-6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.scope: Deactivated successfully.
Dec  3 09:09:26 np0005544118 conmon[169789]: conmon 6dfd51a6c41f9acb8b85 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.scope/container/memory.events
Dec  3 09:09:26 np0005544118 podman[170304]: 2025-12-03 14:09:26.324338925 +0000 UTC m=+0.081818849 container died 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true)
Dec  3 09:09:26 np0005544118 systemd[1]: 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115-7e80b6f7145027c4.timer: Deactivated successfully.
Dec  3 09:09:26 np0005544118 systemd[1]: Stopped /usr/bin/podman healthcheck run 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.
Dec  3 09:09:26 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115-userdata-shm.mount: Deactivated successfully.
Dec  3 09:09:26 np0005544118 systemd[1]: var-lib-containers-storage-overlay-97c6cb6ce483f7df9443217cd7b761e1d024874594bd865997cd350605b18254-merged.mount: Deactivated successfully.
Dec  3 09:09:26 np0005544118 podman[170304]: 2025-12-03 14:09:26.503014887 +0000 UTC m=+0.260494811 container cleanup 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.schema-version=1.0)
Dec  3 09:09:26 np0005544118 podman[170304]: multipathd
Dec  3 09:09:26 np0005544118 podman[170332]: multipathd
Dec  3 09:09:26 np0005544118 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec  3 09:09:26 np0005544118 systemd[1]: Stopped multipathd container.
Dec  3 09:09:26 np0005544118 systemd[1]: Starting multipathd container...
Dec  3 09:09:26 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:09:26 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97c6cb6ce483f7df9443217cd7b761e1d024874594bd865997cd350605b18254/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  3 09:09:26 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97c6cb6ce483f7df9443217cd7b761e1d024874594bd865997cd350605b18254/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  3 09:09:26 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.
Dec  3 09:09:26 np0005544118 podman[170346]: 2025-12-03 14:09:26.70827124 +0000 UTC m=+0.115357773 container init 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:09:26 np0005544118 multipathd[170361]: + sudo -E kolla_set_configs
Dec  3 09:09:26 np0005544118 podman[170346]: 2025-12-03 14:09:26.737037008 +0000 UTC m=+0.144123511 container start 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec  3 09:09:26 np0005544118 podman[170346]: multipathd
Dec  3 09:09:26 np0005544118 systemd[1]: Started multipathd container.
Dec  3 09:09:26 np0005544118 multipathd[170361]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  3 09:09:26 np0005544118 multipathd[170361]: INFO:__main__:Validating config file
Dec  3 09:09:26 np0005544118 multipathd[170361]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  3 09:09:26 np0005544118 multipathd[170361]: INFO:__main__:Writing out command to execute
Dec  3 09:09:26 np0005544118 multipathd[170361]: ++ cat /run_command
Dec  3 09:09:26 np0005544118 multipathd[170361]: + CMD='/usr/sbin/multipathd -d'
Dec  3 09:09:26 np0005544118 multipathd[170361]: + ARGS=
Dec  3 09:09:26 np0005544118 multipathd[170361]: + sudo kolla_copy_cacerts
Dec  3 09:09:26 np0005544118 podman[170368]: 2025-12-03 14:09:26.800114293 +0000 UTC m=+0.053443402 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  3 09:09:26 np0005544118 systemd[1]: 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115-32758c52b173e786.service: Main process exited, code=exited, status=1/FAILURE
Dec  3 09:09:26 np0005544118 systemd[1]: 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115-32758c52b173e786.service: Failed with result 'exit-code'.
Dec  3 09:09:26 np0005544118 multipathd[170361]: + [[ ! -n '' ]]
Dec  3 09:09:26 np0005544118 multipathd[170361]: + . kolla_extend_start
Dec  3 09:09:26 np0005544118 multipathd[170361]: Running command: '/usr/sbin/multipathd -d'
Dec  3 09:09:26 np0005544118 multipathd[170361]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec  3 09:09:26 np0005544118 multipathd[170361]: + umask 0022
Dec  3 09:09:26 np0005544118 multipathd[170361]: + exec /usr/sbin/multipathd -d
Dec  3 09:09:26 np0005544118 multipathd[170361]: 3122.491691 | --------start up--------
Dec  3 09:09:26 np0005544118 multipathd[170361]: 3122.491710 | read /etc/multipath.conf
Dec  3 09:09:26 np0005544118 multipathd[170361]: 3122.497433 | path checkers start up
Dec  3 09:09:27 np0005544118 python3.9[170552]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:28 np0005544118 python3.9[170706]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec  3 09:09:29 np0005544118 python3.9[170858]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec  3 09:09:29 np0005544118 kernel: Key type psk registered
Dec  3 09:09:29 np0005544118 python3.9[171021]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:09:30 np0005544118 python3.9[171146]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764770969.3145924-1241-95710201864828/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:30 np0005544118 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec  3 09:09:31 np0005544118 python3.9[171299]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:31 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:09:31 np0005544118 python3.9[171451]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:09:32 np0005544118 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec  3 09:09:32 np0005544118 systemd[1]: Stopped Load Kernel Modules.
Dec  3 09:09:32 np0005544118 systemd[1]: Stopping Load Kernel Modules...
Dec  3 09:09:32 np0005544118 systemd[1]: Starting Load Kernel Modules...
Dec  3 09:09:32 np0005544118 systemd[1]: Finished Load Kernel Modules.
Dec  3 09:09:32 np0005544118 python3.9[171608]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec  3 09:09:33 np0005544118 systemd[1]: virtqemud.service: Deactivated successfully.
Dec  3 09:09:35 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:35 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:35 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:35 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:35 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:35 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:35 np0005544118 systemd-logind[795]: Watching system buttons on /dev/input/event0 (Power Button)
Dec  3 09:09:35 np0005544118 systemd-logind[795]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec  3 09:09:36 np0005544118 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec  3 09:09:36 np0005544118 systemd[1]: Starting man-db-cache-update.service...
Dec  3 09:09:36 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:36 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:36 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:36 np0005544118 systemd[1]: Queuing reload/restart jobs for marked units…
Dec  3 09:09:36 np0005544118 podman[172300]: 2025-12-03 14:09:36.849752887 +0000 UTC m=+0.082593714 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:09:37 np0005544118 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec  3 09:09:37 np0005544118 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec  3 09:09:37 np0005544118 systemd[1]: Finished man-db-cache-update.service.
Dec  3 09:09:37 np0005544118 systemd[1]: man-db-cache-update.service: Consumed 1.438s CPU time.
Dec  3 09:09:37 np0005544118 systemd[1]: run-r04a34839e3714cd29ef527798afc2a0d.service: Deactivated successfully.
Dec  3 09:09:38 np0005544118 python3.9[173088]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:09:38 np0005544118 systemd[1]: Stopping Open-iSCSI...
Dec  3 09:09:38 np0005544118 iscsid[161413]: iscsid shutting down.
Dec  3 09:09:38 np0005544118 systemd[1]: iscsid.service: Deactivated successfully.
Dec  3 09:09:38 np0005544118 systemd[1]: Stopped Open-iSCSI.
Dec  3 09:09:38 np0005544118 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec  3 09:09:38 np0005544118 systemd[1]: Starting Open-iSCSI...
Dec  3 09:09:38 np0005544118 systemd[1]: Started Open-iSCSI.
Dec  3 09:09:39 np0005544118 python3.9[173242]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:09:40 np0005544118 python3.9[173398]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:41 np0005544118 python3.9[173550]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:09:41 np0005544118 systemd[1]: Reloading.
Dec  3 09:09:41 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:09:41 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:09:42 np0005544118 podman[173708]: 2025-12-03 14:09:42.113842162 +0000 UTC m=+0.108116484 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  3 09:09:42 np0005544118 python3.9[173745]: ansible-ansible.builtin.service_facts Invoked
Dec  3 09:09:42 np0005544118 network[173779]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  3 09:09:42 np0005544118 network[173780]: 'network-scripts' will be removed from distribution in near future.
Dec  3 09:09:42 np0005544118 network[173781]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  3 09:09:46 np0005544118 python3.9[174055]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:47 np0005544118 python3.9[174208]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:48 np0005544118 python3.9[174361]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:49 np0005544118 python3.9[174514]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:50 np0005544118 python3.9[174667]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:51 np0005544118 python3.9[174820]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:51 np0005544118 python3.9[174973]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:52 np0005544118 python3.9[175126]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:09:53 np0005544118 python3.9[175279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:54 np0005544118 python3.9[175431]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:54 np0005544118 python3.9[175583]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:55 np0005544118 python3.9[175735]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:55 np0005544118 python3.9[175887]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:56 np0005544118 python3.9[176039]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:56 np0005544118 python3.9[176191]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:57 np0005544118 podman[176315]: 2025-12-03 14:09:57.261474252 +0000 UTC m=+0.073130788 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:09:57 np0005544118 python3.9[176362]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:58 np0005544118 python3.9[176515]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:59 np0005544118 python3.9[176667]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:09:59 np0005544118 python3.9[176819]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:00 np0005544118 python3.9[176971]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:10:00.935 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:10:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:10:00.937 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:10:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:10:00.937 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:10:01 np0005544118 python3.9[177123]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:01 np0005544118 python3.9[177275]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:02 np0005544118 python3.9[177427]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:02 np0005544118 python3.9[177579]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:04 np0005544118 python3.9[177731]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:05 np0005544118 python3.9[177883]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  3 09:10:05 np0005544118 python3.9[178035]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:10:05 np0005544118 systemd[1]: Reloading.
Dec  3 09:10:06 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:10:06 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:10:06 np0005544118 python3.9[178222]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:07 np0005544118 podman[178224]: 2025-12-03 14:10:07.034980344 +0000 UTC m=+0.067643841 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:10:07 np0005544118 python3.9[178393]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:08 np0005544118 python3.9[178546]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:08 np0005544118 python3.9[178699]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:09 np0005544118 python3.9[178852]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:09 np0005544118 python3.9[179005]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:10 np0005544118 python3.9[179158]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:11 np0005544118 python3.9[179311]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:10:12 np0005544118 podman[179436]: 2025-12-03 14:10:12.747025288 +0000 UTC m=+0.116667134 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:10:12 np0005544118 python3.9[179481]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:13 np0005544118 python3.9[179642]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:14 np0005544118 python3.9[179794]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:14 np0005544118 python3.9[179946]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:15 np0005544118 python3.9[180098]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:15 np0005544118 python3.9[180250]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:16 np0005544118 python3.9[180402]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:17 np0005544118 python3.9[180554]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:17 np0005544118 python3.9[180706]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:18 np0005544118 python3.9[180858]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:23 np0005544118 python3.9[181010]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec  3 09:10:23 np0005544118 python3.9[181163]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  3 09:10:25 np0005544118 python3.9[181321]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  3 09:10:26 np0005544118 systemd-logind[795]: New session 26 of user zuul.
Dec  3 09:10:26 np0005544118 systemd[1]: Started Session 26 of User zuul.
Dec  3 09:10:26 np0005544118 systemd[1]: session-26.scope: Deactivated successfully.
Dec  3 09:10:26 np0005544118 systemd-logind[795]: Session 26 logged out. Waiting for processes to exit.
Dec  3 09:10:26 np0005544118 systemd-logind[795]: Removed session 26.
Dec  3 09:10:27 np0005544118 python3.9[181507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:27 np0005544118 podman[181602]: 2025-12-03 14:10:27.594849664 +0000 UTC m=+0.089764867 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Dec  3 09:10:27 np0005544118 python3.9[181640]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771026.7367685-2322-202022855535034/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:28 np0005544118 python3.9[181797]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:28 np0005544118 python3.9[181873]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:29 np0005544118 python3.9[182023]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:29 np0005544118 python3.9[182144]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771028.835975-2322-19088688338151/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:30 np0005544118 python3.9[182294]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:30 np0005544118 python3.9[182415]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771029.892175-2322-171363015850582/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:31 np0005544118 python3.9[182565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:31 np0005544118 python3.9[182686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771030.9783115-2322-16467922343808/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:32 np0005544118 python3.9[182836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:32 np0005544118 python3.9[182957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771032.0465293-2322-158560696209825/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:34 np0005544118 python3.9[183109]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:35 np0005544118 python3.9[183261]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:36 np0005544118 python3.9[183413]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:10:36 np0005544118 python3.9[183565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:37 np0005544118 python3.9[183688]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764771036.2769141-2536-132189456559403/.source _original_basename=.kwgi3l5b follow=False checksum=9449357ff163dae2df8e3867a77c5280e9466874 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec  3 09:10:37 np0005544118 podman[183689]: 2025-12-03 14:10:37.321479301 +0000 UTC m=+0.046165528 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:10:38 np0005544118 python3.9[183859]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:10:38 np0005544118 python3.9[184011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:39 np0005544118 python3.9[184132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771038.538096-2588-67791799536615/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:40 np0005544118 python3.9[184282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:10:40 np0005544118 python3.9[184403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771039.7780976-2619-19437883641191/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:10:41 np0005544118 python3.9[184555]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec  3 09:10:42 np0005544118 python3.9[184707]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  3 09:10:43 np0005544118 podman[184831]: 2025-12-03 14:10:43.012482884 +0000 UTC m=+0.125907468 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller)
Dec  3 09:10:43 np0005544118 python3[184878]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec  3 09:10:43 np0005544118 podman[184922]: 2025-12-03 14:10:43.37976935 +0000 UTC m=+0.054566772 container create 9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:10:43 np0005544118 podman[184922]: 2025-12-03 14:10:43.346730811 +0000 UTC m=+0.021528263 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  3 09:10:43 np0005544118 python3[184878]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec  3 09:10:44 np0005544118 python3.9[185108]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:10:45 np0005544118 python3.9[185262]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec  3 09:10:46 np0005544118 python3.9[185414]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  3 09:10:46 np0005544118 python3[185566]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec  3 09:10:47 np0005544118 podman[185602]: 2025-12-03 14:10:47.03629266 +0000 UTC m=+0.066779397 container create 096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec  3 09:10:47 np0005544118 podman[185602]: 2025-12-03 14:10:46.997236161 +0000 UTC m=+0.027722978 image pull 5571c1b2140c835f70406e4553b3b44135b9c9b4eb673345cbd571460c5d59a3 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec  3 09:10:47 np0005544118 python3[185566]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec  3 09:10:47 np0005544118 python3.9[185792]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:10:48 np0005544118 python3.9[185946]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:49 np0005544118 python3.9[186097]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764771048.6688657-2802-262657324114604/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:10:49 np0005544118 python3.9[186173]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:10:49 np0005544118 systemd[1]: Reloading.
Dec  3 09:10:49 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:10:49 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:10:50 np0005544118 python3.9[186284]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:10:50 np0005544118 systemd[1]: Reloading.
Dec  3 09:10:50 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:10:50 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:10:50 np0005544118 systemd[1]: Starting nova_compute container...
Dec  3 09:10:51 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:10:51 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:51 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:51 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:51 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:51 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:51 np0005544118 podman[186324]: 2025-12-03 14:10:51.048416995 +0000 UTC m=+0.102793285 container init 096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:10:51 np0005544118 podman[186324]: 2025-12-03 14:10:51.055687838 +0000 UTC m=+0.110064108 container start 096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + sudo -E kolla_set_configs
Dec  3 09:10:51 np0005544118 podman[186324]: nova_compute
Dec  3 09:10:51 np0005544118 systemd[1]: Started nova_compute container.
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Validating config file
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying service configuration files
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Deleting /etc/ceph
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Creating directory /etc/ceph
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /etc/ceph
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Writing out command to execute
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:51 np0005544118 nova_compute[186339]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  3 09:10:51 np0005544118 nova_compute[186339]: ++ cat /run_command
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + CMD=nova-compute
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + ARGS=
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + sudo kolla_copy_cacerts
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + [[ ! -n '' ]]
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + . kolla_extend_start
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + echo 'Running command: '\''nova-compute'\'''
Dec  3 09:10:51 np0005544118 nova_compute[186339]: Running command: 'nova-compute'
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + umask 0022
Dec  3 09:10:51 np0005544118 nova_compute[186339]: + exec nova-compute
Dec  3 09:10:52 np0005544118 python3.9[186501]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:10:53 np0005544118 python3.9[186651]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.161 186343 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.162 186343 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.162 186343 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.162 186343 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.305 186343 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.330 186343 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.330 186343 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  3 09:10:53 np0005544118 python3.9[186805]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:10:53 np0005544118 nova_compute[186339]: 2025-12-03 14:10:53.966 186343 INFO nova.virt.driver [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.110 186343 INFO nova.compute.provider_config [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.141 186343 DEBUG oslo_concurrency.lockutils [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.142 186343 DEBUG oslo_concurrency.lockutils [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.142 186343 DEBUG oslo_concurrency.lockutils [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.142 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.142 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.143 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.143 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.143 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.143 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.143 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.143 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.143 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.144 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.144 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.144 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.144 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.144 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.144 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.145 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.145 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.145 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.145 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.145 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.145 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.145 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.146 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.146 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.146 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.146 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.146 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.146 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.147 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.147 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.147 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.147 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.147 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.147 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.148 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.148 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.148 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.148 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.148 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.148 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.148 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.149 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.149 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.149 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.149 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.149 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.149 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.149 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.150 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.150 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.150 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.150 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.150 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.150 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.151 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.151 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.151 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.151 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.151 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.151 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.152 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.152 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.152 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.152 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.152 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.152 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.153 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.153 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.153 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.153 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.153 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.153 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.153 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.154 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.154 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.154 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.154 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.154 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.154 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.155 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.155 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.155 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.155 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.155 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.155 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.155 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.156 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.156 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.156 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.156 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.156 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.156 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.157 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.157 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.157 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.157 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.157 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.157 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.157 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.158 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.158 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.158 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.158 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.158 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.159 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.159 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.159 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.159 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.159 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.159 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.160 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.160 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.160 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.160 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.160 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.160 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.160 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.161 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.161 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.161 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.161 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.161 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.161 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.162 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.162 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.162 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.162 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.162 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.163 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.163 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.163 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.163 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.163 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.163 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.163 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.164 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.164 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.164 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.164 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.164 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.164 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.164 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.165 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.165 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.165 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.165 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.165 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.165 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.165 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.166 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.166 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.166 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.166 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.166 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.166 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.166 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.167 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.167 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.167 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.167 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.167 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.167 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.167 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.168 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.168 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.168 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.168 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.168 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.168 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.168 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.169 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.169 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.169 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.169 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.169 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.169 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.169 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.170 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.170 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.170 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.170 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.170 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.170 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.171 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.171 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.171 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.171 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.171 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.171 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.171 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.172 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.172 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.172 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.172 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.172 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.172 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.172 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.173 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.173 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.173 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.173 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.173 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.173 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.174 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.174 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.174 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.174 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.174 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.174 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.174 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.175 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.175 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.175 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.175 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.175 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.175 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.175 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.176 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.176 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.176 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.176 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.176 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.176 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.176 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.177 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.177 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.177 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.177 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.177 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.177 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.177 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.178 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.178 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.178 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.178 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.178 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.178 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.178 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.179 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.179 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.179 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.179 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.179 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.179 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.179 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.180 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.180 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.180 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.180 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.180 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.180 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.180 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.181 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.181 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.181 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.181 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.181 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.181 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.181 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.182 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.182 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.182 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.182 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.182 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.182 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.182 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.183 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.183 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.183 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.183 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.183 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.183 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.183 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.184 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.184 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.184 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.184 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.184 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.184 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.184 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.185 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.185 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.185 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.185 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.185 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.185 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.186 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.186 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.186 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.186 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.186 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.186 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.186 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.187 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.187 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.187 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.187 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.187 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.187 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.187 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.188 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.188 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.188 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.188 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.188 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.188 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.188 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.189 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.189 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.189 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.189 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.189 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.189 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.189 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.190 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.190 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.190 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.190 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.190 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.190 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.190 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.191 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.191 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.191 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.191 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.191 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.191 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.191 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.192 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.192 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.192 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.192 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.192 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.192 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.192 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.193 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.193 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.193 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.193 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.193 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.193 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.193 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.194 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.194 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.194 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.194 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.194 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.194 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.194 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.195 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.195 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.195 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.195 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.195 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.195 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.196 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.196 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.196 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.196 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.196 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.196 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.196 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.197 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.197 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.197 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.197 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.197 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.197 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.198 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.198 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.198 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.198 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.198 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.198 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.198 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.199 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.199 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.199 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.199 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.199 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.199 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.199 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.200 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.200 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.200 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.200 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.200 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.200 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.200 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.201 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.201 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.201 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.201 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.201 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.201 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.201 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.202 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.202 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.202 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.202 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.202 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.202 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.202 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.203 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.204 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.204 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.204 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.204 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.204 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.204 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.204 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.205 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.205 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.205 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.205 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.205 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.205 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.205 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.206 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.206 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.206 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.206 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.206 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.206 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.206 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.207 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.207 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.207 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.207 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.207 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.207 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.208 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.208 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.208 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.208 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.208 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.208 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.208 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.209 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.209 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.209 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.209 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.209 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.209 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.210 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.210 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.210 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.210 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.210 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.210 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.211 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.211 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.211 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.211 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.211 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.211 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.211 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.212 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.212 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.212 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.212 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.212 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.212 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.212 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.213 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.213 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.213 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.213 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.213 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.214 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.214 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.214 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.214 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.214 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.214 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.215 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.215 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.215 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.215 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.215 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.215 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.215 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.216 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.216 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.216 186343 WARNING oslo_config.cfg [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  3 09:10:54 np0005544118 nova_compute[186339]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  3 09:10:54 np0005544118 nova_compute[186339]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  3 09:10:54 np0005544118 nova_compute[186339]: and ``live_migration_inbound_addr`` respectively.
Dec  3 09:10:54 np0005544118 nova_compute[186339]: ).  Its value may be silently ignored in the future.#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.216 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.216 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.217 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.217 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.217 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.217 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.217 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.217 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.217 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.218 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.218 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.218 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.218 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.218 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.218 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.218 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.219 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.219 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.219 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.219 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.219 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.219 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.220 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.220 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.220 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.220 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.220 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.220 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.221 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.221 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.221 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.221 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.221 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.221 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.222 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.222 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.222 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.222 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.222 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.222 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.222 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.223 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.223 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.223 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.223 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.223 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.223 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.223 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.224 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.224 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.224 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.224 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.224 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.224 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.224 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.225 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.225 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.225 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.225 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.225 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.225 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.225 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.226 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.226 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.226 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.226 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.226 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.226 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.226 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.227 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.227 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.227 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.227 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.227 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.227 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.227 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.228 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.229 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.229 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.229 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.229 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.229 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.229 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.230 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.231 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.231 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.231 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.231 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.231 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.231 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.231 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.232 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.232 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.232 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.232 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.232 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.232 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.232 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.233 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.233 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.233 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.233 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.233 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.233 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.233 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.234 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.234 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.234 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.234 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.234 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.234 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.234 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.235 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.235 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.235 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.235 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.235 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.235 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.236 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.236 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.236 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.236 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.236 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.236 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.236 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.237 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.237 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.237 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.237 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.237 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.238 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.238 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.238 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.238 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.238 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.238 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.238 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.239 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.239 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.239 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.239 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.239 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.239 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.239 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.240 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.240 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.240 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.240 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.240 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.240 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.241 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.241 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.241 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.241 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.241 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.241 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.241 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.242 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.242 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.242 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.242 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.242 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.242 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.243 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.243 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.243 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.243 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.243 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.243 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.244 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.245 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.245 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.245 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.245 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.245 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.245 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.246 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.246 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.246 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.246 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.246 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.246 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.247 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.247 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.247 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.247 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.247 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.247 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.247 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.248 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.248 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.248 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.248 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.248 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.248 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.248 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.249 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.249 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.249 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.249 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.249 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.249 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.249 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.250 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.250 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.250 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.250 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.250 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.250 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.250 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.251 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.251 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.251 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.251 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.251 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.251 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.251 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.252 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.253 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.253 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.253 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.253 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.253 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.254 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.254 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.254 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.254 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.254 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.254 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.254 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.255 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.255 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.255 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.255 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.255 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.255 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.255 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.256 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.257 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.257 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.257 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.257 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.257 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.257 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.258 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.258 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.258 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.258 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.258 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.258 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.258 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.259 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.259 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.259 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.259 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.259 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.259 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.260 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.260 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.260 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.260 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.260 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.260 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.261 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.261 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.261 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.261 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.261 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.261 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.261 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.262 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.262 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.262 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.262 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.262 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.262 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.262 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.263 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.263 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.263 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.263 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.263 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.263 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.263 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.264 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.264 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.264 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.264 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.264 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.264 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.264 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.265 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.265 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.265 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.265 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.265 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.265 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.265 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.266 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.266 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.266 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.266 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.266 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.266 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.266 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.267 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.267 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.267 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.267 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.267 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.267 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.267 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.268 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.269 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.269 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.269 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.269 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.269 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.269 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.269 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.270 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.270 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.270 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.270 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.270 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.270 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.270 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.271 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.272 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.272 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.272 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.272 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.272 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.272 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.273 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.273 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.273 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.273 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.273 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.273 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.274 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.274 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.274 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.274 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.274 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.274 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.275 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.275 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.275 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.275 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.275 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.275 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.275 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.276 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.276 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.276 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.276 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.276 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.276 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.276 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.277 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.277 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.277 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.277 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.277 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.277 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.277 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.278 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.278 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.278 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.278 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.278 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.278 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.278 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.279 186343 DEBUG oslo_service.service [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  3 09:10:54 np0005544118 nova_compute[186339]: 2025-12-03 14:10:54.280 186343 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.055 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.056 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.056 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.056 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  3 09:10:55 np0005544118 systemd[1]: Starting libvirt QEMU daemon...
Dec  3 09:10:55 np0005544118 systemd[1]: Started libvirt QEMU daemon.
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.123 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9d3737ab50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.127 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9d3737ab50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.128 186343 INFO nova.virt.libvirt.driver [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.149 186343 WARNING nova.virt.libvirt.driver [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.150 186343 DEBUG nova.virt.libvirt.volume.mount [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec  3 09:10:55 np0005544118 python3.9[186999]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  3 09:10:55 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.971 186343 INFO nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Libvirt host capabilities <capabilities>
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <host>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <uuid>5f58e40a-1e39-4838-95df-ccf3b2b3eaa6</uuid>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <cpu>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <arch>x86_64</arch>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <model>EPYC-Rome-v4</model>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <vendor>AMD</vendor>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <microcode version='16777317'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <signature family='23' model='49' stepping='0'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='x2apic'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='tsc-deadline'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='osxsave'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='hypervisor'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='tsc_adjust'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='spec-ctrl'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='stibp'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='arch-capabilities'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='ssbd'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='cmp_legacy'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='topoext'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='virt-ssbd'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='lbrv'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='tsc-scale'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='vmcb-clean'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='pause-filter'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='pfthreshold'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='svme-addr-chk'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='rdctl-no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='skip-l1dfl-vmentry'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='mds-no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <feature name='pschange-mc-no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <pages unit='KiB' size='4'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <pages unit='KiB' size='2048'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <pages unit='KiB' size='1048576'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </cpu>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <power_management>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <suspend_mem/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <suspend_disk/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <suspend_hybrid/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </power_management>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <iommu support='no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <migration_features>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <live/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <uri_transports>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <uri_transport>tcp</uri_transport>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <uri_transport>rdma</uri_transport>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      </uri_transports>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </migration_features>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <topology>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <cells num='1'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <cell id='0'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          <memory unit='KiB'>7864312</memory>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          <pages unit='KiB' size='4'>1966078</pages>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          <pages unit='KiB' size='2048'>0</pages>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          <distances>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <sibling id='0' value='10'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          </distances>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          <cpus num='8'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:          </cpus>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        </cell>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      </cells>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </topology>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <cache>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </cache>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <secmodel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <model>selinux</model>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <doi>0</doi>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </secmodel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <secmodel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <model>dac</model>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <doi>0</doi>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </secmodel>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  </host>
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <guest>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <os_type>hvm</os_type>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <arch name='i686'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <wordsize>32</wordsize>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <domain type='qemu'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <domain type='kvm'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </arch>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <features>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <pae/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <nonpae/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <acpi default='on' toggle='yes'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <apic default='on' toggle='no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <cpuselection/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <deviceboot/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <disksnapshot default='on' toggle='no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <externalSnapshot/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </features>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  </guest>
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <guest>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <os_type>hvm</os_type>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <arch name='x86_64'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <wordsize>64</wordsize>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <domain type='qemu'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <domain type='kvm'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </arch>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <features>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <acpi default='on' toggle='yes'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <apic default='on' toggle='no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <cpuselection/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <deviceboot/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <disksnapshot default='on' toggle='no'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <externalSnapshot/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </features>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  </guest>
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 
Dec  3 09:10:55 np0005544118 nova_compute[186339]: </capabilities>
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.979 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  3 09:10:55 np0005544118 nova_compute[186339]: 2025-12-03 14:10:55.997 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  3 09:10:55 np0005544118 nova_compute[186339]: <domainCapabilities>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <domain>kvm</domain>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <arch>i686</arch>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <vcpu max='240'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <iothreads supported='yes'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <os supported='yes'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <enum name='firmware'/>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <loader supported='yes'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>rom</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>pflash</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <enum name='readonly'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>yes</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <enum name='secure'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </loader>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  </os>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:  <cpu>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <mode name='maximum' supported='yes'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <enum name='maximumMigratable'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:    <mode name='host-model' supported='yes'>
Dec  3 09:10:55 np0005544118 nova_compute[186339]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <vendor>AMD</vendor>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='x2apic'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='stibp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='succor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lbrv'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='custom' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Dhyana-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-128'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-256'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-512'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <memoryBacking supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <enum name='sourceType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>anonymous</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>memfd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </memoryBacking>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <disk supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='diskDevice'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>disk</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cdrom</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>floppy</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>lun</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ide</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>fdc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>sata</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </disk>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <graphics supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vnc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egl-headless</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </graphics>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <video supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='modelType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vga</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cirrus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>none</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>bochs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ramfb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </video>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hostdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='mode'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>subsystem</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='startupPolicy'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>mandatory</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>requisite</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>optional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='subsysType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pci</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='capsType'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='pciBackend'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hostdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <rng supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>random</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </rng>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <filesystem supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='driverType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>path</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>handle</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtiofs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </filesystem>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <tpm supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-tis</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-crb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emulator</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>external</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendVersion'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>2.0</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </tpm>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <redirdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </redirdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <channel supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </channel>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <crypto supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </crypto>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <interface supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>passt</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </interface>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <panic supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>isa</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>hyperv</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </panic>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <console supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>null</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dev</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pipe</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stdio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>udp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tcp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu-vdagent</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </console>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <gic supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <vmcoreinfo supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <genid supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backingStoreInput supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backup supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <async-teardown supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <ps2 supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sev supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sgx supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hyperv supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='features'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>relaxed</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vapic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>spinlocks</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vpindex</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>runtime</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>synic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stimer</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reset</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vendor_id</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>frequencies</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reenlightenment</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tlbflush</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ipi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>avic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emsr_bitmap</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>xmm_input</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <spinlocks>4095</spinlocks>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <stimer_direct>on</stimer_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hyperv>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <launchSecurity supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='sectype'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tdx</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </launchSecurity>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: </domainCapabilities>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.003 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  3 09:10:56 np0005544118 nova_compute[186339]: <domainCapabilities>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <domain>kvm</domain>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <arch>i686</arch>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <vcpu max='4096'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <iothreads supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <os supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <enum name='firmware'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <loader supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>rom</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pflash</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='readonly'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>yes</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='secure'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </loader>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </os>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='maximum' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='maximumMigratable'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='host-model' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <vendor>AMD</vendor>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='x2apic'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='stibp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='succor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lbrv'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='custom' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Dhyana-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-128'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-256'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-512'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <memoryBacking supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <enum name='sourceType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>anonymous</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>memfd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </memoryBacking>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <disk supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='diskDevice'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>disk</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cdrom</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>floppy</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>lun</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>fdc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>sata</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </disk>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <graphics supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vnc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egl-headless</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </graphics>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <video supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='modelType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vga</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cirrus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>none</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>bochs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ramfb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </video>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hostdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='mode'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>subsystem</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='startupPolicy'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>mandatory</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>requisite</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>optional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='subsysType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pci</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='capsType'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='pciBackend'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hostdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <rng supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>random</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </rng>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <filesystem supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='driverType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>path</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>handle</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtiofs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </filesystem>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <tpm supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-tis</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-crb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emulator</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>external</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendVersion'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>2.0</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </tpm>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <redirdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </redirdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <channel supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </channel>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <crypto supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </crypto>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <interface supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>passt</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </interface>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <panic supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>isa</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>hyperv</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </panic>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <console supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>null</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dev</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pipe</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stdio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>udp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tcp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu-vdagent</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </console>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <gic supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <vmcoreinfo supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <genid supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backingStoreInput supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backup supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <async-teardown supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <ps2 supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sev supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sgx supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hyperv supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='features'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>relaxed</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vapic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>spinlocks</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vpindex</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>runtime</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>synic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stimer</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reset</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vendor_id</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>frequencies</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reenlightenment</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tlbflush</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ipi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>avic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emsr_bitmap</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>xmm_input</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <spinlocks>4095</spinlocks>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <stimer_direct>on</stimer_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hyperv>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <launchSecurity supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='sectype'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tdx</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </launchSecurity>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: </domainCapabilities>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.043 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.048 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  3 09:10:56 np0005544118 nova_compute[186339]: <domainCapabilities>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <domain>kvm</domain>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <arch>x86_64</arch>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <vcpu max='240'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <iothreads supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <os supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <enum name='firmware'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <loader supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>rom</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pflash</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='readonly'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>yes</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='secure'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </loader>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </os>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='maximum' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='maximumMigratable'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='host-model' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <vendor>AMD</vendor>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='x2apic'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='stibp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='succor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lbrv'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='custom' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Dhyana-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-128'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-256'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-512'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <memoryBacking supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <enum name='sourceType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>anonymous</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>memfd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </memoryBacking>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <disk supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='diskDevice'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>disk</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cdrom</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>floppy</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>lun</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ide</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>fdc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>sata</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </disk>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <graphics supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vnc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egl-headless</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </graphics>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <video supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='modelType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vga</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cirrus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>none</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>bochs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ramfb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </video>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hostdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='mode'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>subsystem</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='startupPolicy'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>mandatory</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>requisite</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>optional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='subsysType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pci</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='capsType'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='pciBackend'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hostdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <rng supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>random</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </rng>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <filesystem supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='driverType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>path</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>handle</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtiofs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </filesystem>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <tpm supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-tis</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-crb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emulator</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>external</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendVersion'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>2.0</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </tpm>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <redirdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </redirdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <channel supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </channel>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <crypto supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </crypto>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <interface supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>passt</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </interface>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <panic supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>isa</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>hyperv</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </panic>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <console supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>null</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dev</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pipe</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stdio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>udp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tcp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu-vdagent</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </console>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <gic supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <vmcoreinfo supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <genid supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backingStoreInput supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backup supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <async-teardown supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <ps2 supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sev supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sgx supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hyperv supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='features'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>relaxed</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vapic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>spinlocks</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vpindex</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>runtime</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>synic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stimer</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reset</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vendor_id</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>frequencies</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reenlightenment</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tlbflush</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ipi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>avic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emsr_bitmap</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>xmm_input</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <spinlocks>4095</spinlocks>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <stimer_direct>on</stimer_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hyperv>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <launchSecurity supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='sectype'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tdx</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </launchSecurity>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: </domainCapabilities>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.114 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  3 09:10:56 np0005544118 nova_compute[186339]: <domainCapabilities>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <domain>kvm</domain>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <arch>x86_64</arch>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <vcpu max='4096'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <iothreads supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <os supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <enum name='firmware'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>efi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <loader supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>rom</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pflash</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='readonly'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>yes</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='secure'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>yes</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>no</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </loader>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </os>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='maximum' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='maximumMigratable'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>on</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>off</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='host-model' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <vendor>AMD</vendor>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='x2apic'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='stibp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='succor'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lbrv'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <mode name='custom' supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Broadwell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Cooperlake-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Denverton-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Dhyana-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='auto-ibrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amd-psfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='no-nested-data-bp'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='null-sel-clr-base'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='stibp-always-on'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='EPYC-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-128'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-256'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx10-512'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='prefetchiti'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Haswell-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='IvyBridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='KnightsMill-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4fmaps'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-4vnniw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512er'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512pf'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fma4'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tbm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xop'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='amx-tile'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-bf16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-fp16'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bitalg'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vbmi2'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrc'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fzrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='la57'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='taa-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='tsx-ldtrk'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xfd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='SierraForest-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ifma'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-ne-convert'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx-vnni-int8'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='bus-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cmpccxadd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fbsdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='fsrs'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ibrs-all'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mcdt-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pbrsb-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='psdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='serialize'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vaes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='vpclmulqdq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='hle'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='rtm'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512bw'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512cd'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512dq'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512f'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='avx512vl'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='invpcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pcid'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='pku'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='mpx'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v2'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v3'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='core-capability'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='split-lock-detect'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='Snowridge-v4'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='cldemote'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='erms'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='gfni'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdir64b'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='movdiri'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='xsaves'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='athlon-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='core2duo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='coreduo-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='n270-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='ss'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <blockers model='phenom-v1'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnow'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <feature name='3dnowext'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </blockers>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </mode>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <memoryBacking supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <enum name='sourceType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>anonymous</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <value>memfd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </memoryBacking>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <disk supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='diskDevice'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>disk</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cdrom</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>floppy</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>lun</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>fdc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>sata</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </disk>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <graphics supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vnc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egl-headless</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </graphics>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <video supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='modelType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vga</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>cirrus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>none</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>bochs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ramfb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </video>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hostdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='mode'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>subsystem</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='startupPolicy'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>mandatory</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>requisite</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>optional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='subsysType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pci</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>scsi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='capsType'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='pciBackend'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hostdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <rng supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtio-non-transitional</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>random</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>egd</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </rng>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <filesystem supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='driverType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>path</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>handle</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>virtiofs</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </filesystem>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <tpm supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-tis</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tpm-crb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emulator</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>external</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendVersion'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>2.0</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </tpm>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <redirdev supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='bus'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>usb</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </redirdev>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <channel supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </channel>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <crypto supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendModel'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>builtin</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </crypto>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <interface supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='backendType'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>default</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>passt</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </interface>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <panic supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='model'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>isa</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>hyperv</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </panic>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <console supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='type'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>null</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vc</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pty</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dev</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>file</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>pipe</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stdio</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>udp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tcp</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>unix</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>qemu-vdagent</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>dbus</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </console>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </devices>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <gic supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <vmcoreinfo supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <genid supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backingStoreInput supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <backup supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <async-teardown supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <ps2 supported='yes'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sev supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <sgx supported='no'/>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <hyperv supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='features'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>relaxed</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vapic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>spinlocks</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vpindex</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>runtime</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>synic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>stimer</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reset</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>vendor_id</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>frequencies</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>reenlightenment</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tlbflush</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>ipi</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>avic</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>emsr_bitmap</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>xmm_input</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <spinlocks>4095</spinlocks>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <stimer_direct>on</stimer_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </defaults>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </hyperv>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    <launchSecurity supported='yes'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      <enum name='sectype'>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:        <value>tdx</value>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:      </enum>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:    </launchSecurity>
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  </features>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: </domainCapabilities>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.173 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.173 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.173 186343 DEBUG nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.174 186343 INFO nova.virt.libvirt.host [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Secure Boot support detected#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.176 186343 INFO nova.virt.libvirt.driver [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.176 186343 INFO nova.virt.libvirt.driver [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.187 186343 DEBUG nova.virt.libvirt.driver [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] cpu compare xml: <cpu match="exact">
Dec  3 09:10:56 np0005544118 nova_compute[186339]:  <model>Nehalem</model>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: </cpu>
Dec  3 09:10:56 np0005544118 nova_compute[186339]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.191 186343 DEBUG nova.virt.libvirt.driver [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.228 186343 INFO nova.virt.node [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Determined node identity 52e95542-7192-4eec-a5dc-18596ad73a72 from /var/lib/nova/compute_id#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.247 186343 WARNING nova.compute.manager [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Compute nodes ['52e95542-7192-4eec-a5dc-18596ad73a72'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.282 186343 INFO nova.compute.manager [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  3 09:10:56 np0005544118 python3.9[187197]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.316 186343 WARNING nova.compute.manager [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.317 186343 DEBUG oslo_concurrency.lockutils [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.318 186343 DEBUG oslo_concurrency.lockutils [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.319 186343 DEBUG oslo_concurrency.lockutils [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.319 186343 DEBUG nova.compute.resource_tracker [None req-6b77ae4f-c403-4aa6-b54a-566d6df6951b - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:10:56 np0005544118 systemd[1]: Stopping nova_compute container...
Dec  3 09:10:56 np0005544118 systemd[1]: Starting libvirt nodedev daemon...
Dec  3 09:10:56 np0005544118 systemd[1]: Started libvirt nodedev daemon.
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.404 186343 DEBUG oslo_concurrency.lockutils [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.405 186343 DEBUG oslo_concurrency.lockutils [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:10:56 np0005544118 nova_compute[186339]: 2025-12-03 14:10:56.405 186343 DEBUG oslo_concurrency.lockutils [None req-f7e8f26a-799c-45fa-ba6f-2830f306a6ea - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:10:56 np0005544118 virtqemud[186958]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec  3 09:10:56 np0005544118 virtqemud[186958]: hostname: compute-1
Dec  3 09:10:56 np0005544118 virtqemud[186958]: End of file while reading data: Input/output error
Dec  3 09:10:56 np0005544118 systemd[1]: libpod-096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08.scope: Deactivated successfully.
Dec  3 09:10:56 np0005544118 systemd[1]: libpod-096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08.scope: Consumed 3.161s CPU time.
Dec  3 09:10:56 np0005544118 podman[187201]: 2025-12-03 14:10:56.905230337 +0000 UTC m=+0.548992408 container died 096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:10:56 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08-userdata-shm.mount: Deactivated successfully.
Dec  3 09:10:56 np0005544118 systemd[1]: var-lib-containers-storage-overlay-2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a-merged.mount: Deactivated successfully.
Dec  3 09:10:57 np0005544118 podman[187201]: 2025-12-03 14:10:57.077157468 +0000 UTC m=+0.720919539 container cleanup 096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=edpm, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3)
Dec  3 09:10:57 np0005544118 podman[187201]: nova_compute
Dec  3 09:10:57 np0005544118 podman[187254]: nova_compute
Dec  3 09:10:57 np0005544118 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec  3 09:10:57 np0005544118 systemd[1]: Stopped nova_compute container.
Dec  3 09:10:57 np0005544118 systemd[1]: Starting nova_compute container...
Dec  3 09:10:57 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:10:57 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:57 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:57 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:57 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:57 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b7a2a85e0d2ec7ef3e82e1dfce9504bfe50140c93054535b8d642973dac090a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:57 np0005544118 podman[187267]: 2025-12-03 14:10:57.312740204 +0000 UTC m=+0.138084155 container init 096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Dec  3 09:10:57 np0005544118 podman[187267]: 2025-12-03 14:10:57.319088562 +0000 UTC m=+0.144432483 container start 096b742398c170bac43c245f56e96ab1f664c763fcc9855fa719880baceecb08 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:10:57 np0005544118 podman[187267]: nova_compute
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + sudo -E kolla_set_configs
Dec  3 09:10:57 np0005544118 systemd[1]: Started nova_compute container.
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Validating config file
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying service configuration files
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /etc/ceph
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Creating directory /etc/ceph
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /etc/ceph
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Writing out command to execute
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:57 np0005544118 nova_compute[187283]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec  3 09:10:57 np0005544118 nova_compute[187283]: ++ cat /run_command
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + CMD=nova-compute
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + ARGS=
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + sudo kolla_copy_cacerts
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + [[ ! -n '' ]]
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + . kolla_extend_start
Dec  3 09:10:57 np0005544118 nova_compute[187283]: Running command: 'nova-compute'
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + echo 'Running command: '\''nova-compute'\'''
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + umask 0022
Dec  3 09:10:57 np0005544118 nova_compute[187283]: + exec nova-compute
Dec  3 09:10:57 np0005544118 podman[187394]: 2025-12-03 14:10:57.836386304 +0000 UTC m=+0.060187121 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec  3 09:10:58 np0005544118 python3.9[187466]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec  3 09:10:58 np0005544118 systemd[1]: Started libpod-conmon-9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e.scope.
Dec  3 09:10:58 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:10:58 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8f2cb59cdb13eb3fa12230bf7525495939ac79a46ca7f295ce62361d5eb7166/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:58 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8f2cb59cdb13eb3fa12230bf7525495939ac79a46ca7f295ce62361d5eb7166/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:58 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b8f2cb59cdb13eb3fa12230bf7525495939ac79a46ca7f295ce62361d5eb7166/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec  3 09:10:58 np0005544118 podman[187493]: 2025-12-03 14:10:58.506131631 +0000 UTC m=+0.270634176 container init 9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:10:58 np0005544118 podman[187493]: 2025-12-03 14:10:58.513511507 +0000 UTC m=+0.278014032 container start 9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute_init, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  3 09:10:58 np0005544118 python3.9[187466]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Applying nova statedir ownership
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec  3 09:10:58 np0005544118 nova_compute_init[187514]: INFO:nova_statedir:Nova statedir ownership complete
Dec  3 09:10:58 np0005544118 systemd[1]: libpod-9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e.scope: Deactivated successfully.
Dec  3 09:10:58 np0005544118 podman[187515]: 2025-12-03 14:10:58.581144686 +0000 UTC m=+0.031109349 container died 9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init)
Dec  3 09:10:58 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e-userdata-shm.mount: Deactivated successfully.
Dec  3 09:10:58 np0005544118 systemd[1]: var-lib-containers-storage-overlay-b8f2cb59cdb13eb3fa12230bf7525495939ac79a46ca7f295ce62361d5eb7166-merged.mount: Deactivated successfully.
Dec  3 09:10:58 np0005544118 podman[187525]: 2025-12-03 14:10:58.718724664 +0000 UTC m=+0.129912225 container cleanup 9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  3 09:10:58 np0005544118 systemd[1]: libpod-conmon-9d40691430b94915b0373d1ab02fc1158c351685e19071ed9ccb20b7361cdd9e.scope: Deactivated successfully.
Dec  3 09:10:59 np0005544118 systemd[1]: session-25.scope: Deactivated successfully.
Dec  3 09:10:59 np0005544118 systemd[1]: session-25.scope: Consumed 1min 48.698s CPU time.
Dec  3 09:10:59 np0005544118 systemd-logind[795]: Session 25 logged out. Waiting for processes to exit.
Dec  3 09:10:59 np0005544118 systemd-logind[795]: Removed session 25.
Dec  3 09:10:59 np0005544118 nova_compute[187283]: 2025-12-03 14:10:59.415 187287 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  3 09:10:59 np0005544118 nova_compute[187283]: 2025-12-03 14:10:59.416 187287 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  3 09:10:59 np0005544118 nova_compute[187283]: 2025-12-03 14:10:59.416 187287 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec  3 09:10:59 np0005544118 nova_compute[187283]: 2025-12-03 14:10:59.416 187287 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec  3 09:10:59 np0005544118 nova_compute[187283]: 2025-12-03 14:10:59.572 187287 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:10:59 np0005544118 nova_compute[187283]: 2025-12-03 14:10:59.597 187287 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:10:59 np0005544118 nova_compute[187283]: 2025-12-03 14:10:59.597 187287 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.049 187287 INFO nova.virt.driver [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.175 187287 INFO nova.compute.provider_config [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.185 187287 DEBUG oslo_concurrency.lockutils [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.185 187287 DEBUG oslo_concurrency.lockutils [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.185 187287 DEBUG oslo_concurrency.lockutils [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.185 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.185 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.186 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.186 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.186 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.186 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.186 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.186 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.186 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.187 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.187 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.187 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.187 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.187 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.187 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.187 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.188 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.188 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.188 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.188 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.188 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.188 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.188 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.189 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.189 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.189 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.189 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.189 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.189 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.190 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.190 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.190 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.190 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.190 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.191 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.191 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.191 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.191 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.191 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.191 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.192 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.192 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.192 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.192 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.192 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.192 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.192 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.193 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.193 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.193 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.193 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.193 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.193 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.193 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.194 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.194 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.194 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.194 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.194 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.194 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.194 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.195 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.195 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.195 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.195 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.195 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.195 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.195 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.196 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.196 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.196 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.196 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.196 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.196 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.196 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.197 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.197 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.197 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.197 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.197 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.198 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.199 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.199 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.199 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.199 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.199 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.199 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.199 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.200 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.200 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.200 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.200 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.200 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.200 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.200 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.201 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.201 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.201 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.201 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.201 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.201 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.201 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.202 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.202 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.202 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.202 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.202 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.202 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.202 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.203 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.204 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.204 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.204 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.204 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.204 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.204 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.204 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.205 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.205 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.205 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.205 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.205 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.205 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.205 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.206 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.206 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.206 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.206 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.206 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.206 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.207 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.207 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.207 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.207 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.207 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.207 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.207 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.208 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.208 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.208 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.208 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.208 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.208 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.208 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.209 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.209 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.209 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.209 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.209 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.209 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.209 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.210 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.210 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.210 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.210 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.210 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.210 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.210 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.211 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.211 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.211 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.211 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.211 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.211 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.212 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.212 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.212 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.212 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.212 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.213 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.213 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.213 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.213 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.213 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.213 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.214 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.214 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.214 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.214 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.214 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.214 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.215 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.215 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.215 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.215 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.215 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.215 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.215 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.216 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.216 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.216 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.216 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.216 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.216 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.216 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.217 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.217 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.217 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.217 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.217 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.217 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.218 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.218 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.218 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.218 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.218 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.218 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.219 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.219 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.219 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.219 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.219 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.219 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.219 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.220 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.220 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.220 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.220 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.220 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.220 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.220 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.221 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.221 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.221 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.221 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.221 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.221 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.222 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.222 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.222 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.222 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.222 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.223 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.223 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.223 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.223 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.223 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.224 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.224 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.224 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.224 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.224 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.224 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.225 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.225 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.225 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.225 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.225 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.225 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.225 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.226 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.226 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.226 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.226 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.226 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.226 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.226 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.227 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.227 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.227 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.227 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.227 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.227 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.227 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.228 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.228 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.228 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.228 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.228 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.228 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.229 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.230 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.230 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.230 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.230 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.230 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.230 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.231 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.231 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.231 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.231 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.231 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.231 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.232 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.232 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.232 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.232 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.232 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.232 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.232 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.233 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.233 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.233 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.233 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.233 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.234 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.234 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.234 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.234 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.234 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.234 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.235 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.235 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.235 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.235 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.235 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.235 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.235 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.236 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.236 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.236 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.236 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.236 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.236 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.236 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.237 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.237 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.237 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.237 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.237 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.237 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.238 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.238 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.238 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.238 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.238 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.239 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.239 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.239 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.239 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.239 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.239 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.239 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.240 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.240 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.240 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.240 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.241 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.241 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.241 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.241 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.241 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.241 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.241 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.242 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.242 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.242 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.242 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.242 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.242 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.242 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.243 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.243 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.243 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.243 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.243 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.243 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.243 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.244 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.244 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.244 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.244 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.244 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.244 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.244 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.245 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.245 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.245 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.245 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.245 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.245 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.245 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.246 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.246 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.246 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.246 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.246 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.246 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.247 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.247 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.247 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.247 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.247 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.248 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.248 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.248 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.248 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.248 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.248 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.249 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.249 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.249 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.249 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.249 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.249 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.250 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.250 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.250 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.250 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.250 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.250 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.250 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.251 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.251 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.251 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.251 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.251 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.251 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.251 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.252 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.252 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.252 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.252 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.252 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.253 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.253 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.253 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.253 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.253 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.254 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.254 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.254 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.254 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.254 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.255 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.255 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.255 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.255 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.255 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.255 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.256 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.256 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.256 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.256 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.256 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.257 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.257 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.257 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.257 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.257 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.258 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.258 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.258 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.258 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.258 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.259 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.259 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.259 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.259 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.259 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.260 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.260 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.260 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.260 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.260 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.261 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.261 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.261 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.261 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.261 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.261 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.262 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.262 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.262 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.262 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.262 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.263 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.263 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.263 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.263 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.264 187287 WARNING oslo_config.cfg [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec  3 09:11:00 np0005544118 nova_compute[187283]: live_migration_uri is deprecated for removal in favor of two other options that
Dec  3 09:11:00 np0005544118 nova_compute[187283]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec  3 09:11:00 np0005544118 nova_compute[187283]: and ``live_migration_inbound_addr`` respectively.
Dec  3 09:11:00 np0005544118 nova_compute[187283]: ).  Its value may be silently ignored in the future.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.264 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.264 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.264 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.264 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.265 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.265 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.265 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.265 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.265 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.266 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.266 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.266 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.266 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.266 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.267 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.267 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.267 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.267 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.267 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.268 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.268 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.268 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.268 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.268 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.269 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.269 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.269 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.269 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.269 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.270 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.270 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.270 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.270 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.271 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.271 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.271 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.271 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.271 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.272 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.272 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.272 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.272 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.272 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.272 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.273 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.273 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.273 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.273 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.273 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.273 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.274 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.274 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.274 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.274 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.274 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.274 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.274 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.275 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.275 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.275 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.275 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.275 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.275 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.276 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.276 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.276 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.276 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.276 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.276 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.277 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.277 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.277 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.277 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.277 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.277 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.277 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.278 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.278 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.278 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.278 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.278 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.278 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.279 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.279 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.279 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.279 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.279 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.279 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.280 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.280 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.280 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.280 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.280 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.280 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.281 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.281 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.281 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.281 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.281 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.281 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.282 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.282 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.282 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.282 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.282 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.282 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.283 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.283 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.283 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.283 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.283 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.283 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.284 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.284 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.284 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.284 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.284 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.285 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.285 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.285 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.285 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.285 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.285 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.286 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.286 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.286 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.286 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.286 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.287 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.287 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.287 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.287 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.287 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.287 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.287 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.288 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.288 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.288 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.288 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.289 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.289 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.289 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.289 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.289 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.289 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.290 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.290 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.290 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.290 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.290 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.291 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.291 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.291 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.291 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.291 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.291 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.291 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.292 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.292 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.292 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.292 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.292 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.293 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.293 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.293 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.293 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.293 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.293 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.293 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.294 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.294 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.294 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.294 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.294 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.294 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.295 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.295 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.295 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.295 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.296 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.296 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.296 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.296 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.296 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.296 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.297 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.297 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.297 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.297 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.297 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.297 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.297 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.298 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.298 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.298 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.298 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.298 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.298 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.299 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.299 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.299 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.299 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.299 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.300 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.300 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.300 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.300 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.300 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.300 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.301 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.301 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.301 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.301 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.301 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.301 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.301 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.302 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.302 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.302 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.302 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.302 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.302 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.302 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.303 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.303 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.303 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.303 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.303 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.304 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.304 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.304 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.304 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.304 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.304 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.305 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.305 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.305 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.305 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.305 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.305 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.305 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.306 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.306 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.306 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.306 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.306 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.306 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.306 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.307 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.307 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.307 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.307 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.308 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.308 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.308 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.308 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.308 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.308 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.309 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.309 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.309 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.309 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.309 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.309 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.310 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.310 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.310 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.310 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.310 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.310 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.310 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.311 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.311 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.311 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.311 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.311 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.311 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.311 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.312 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.312 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.312 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.312 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.312 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.312 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.312 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.313 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.313 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.313 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.313 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.313 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.313 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.314 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.314 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.314 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.314 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.314 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.314 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.315 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.315 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.315 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.315 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.315 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.315 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.316 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.316 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.316 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.316 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.316 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.317 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.317 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.317 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.317 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.317 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.318 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.318 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.318 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.318 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.318 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.318 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.319 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.319 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.319 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.319 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.319 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.320 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.320 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.320 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.320 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.320 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.321 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.321 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.321 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.321 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.321 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.321 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.321 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.322 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.322 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.322 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.322 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.322 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.322 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.323 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.323 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.323 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.323 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.323 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.323 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.324 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.324 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.324 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.324 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.324 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.324 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.325 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.325 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.325 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.325 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.325 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.325 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.326 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.326 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.326 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.326 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.326 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.326 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.326 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.327 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.327 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.327 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.327 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.327 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.327 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.328 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.328 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.328 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.328 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.328 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.328 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.329 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.329 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.329 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.329 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.329 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.330 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.330 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.330 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.330 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.330 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.331 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.331 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.331 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.331 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.331 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.332 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.332 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.332 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.332 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.332 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.333 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.333 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.333 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.333 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.333 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.333 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.334 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.334 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.334 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.334 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.334 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.334 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.335 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.335 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.335 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.335 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.335 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.335 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.335 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.336 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.336 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.336 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.336 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.336 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.336 187287 DEBUG oslo_service.service [None req-04bbfa41-0037-4b5e-8a23-39a7cd174abd - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.337 187287 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.388 187287 INFO nova.virt.node [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Determined node identity 52e95542-7192-4eec-a5dc-18596ad73a72 from /var/lib/nova/compute_id#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.389 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.390 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.390 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.391 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.404 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f92bc056670> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.406 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f92bc056670> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.407 187287 INFO nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Connection event '1' reason 'None'#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.414 187287 INFO nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Libvirt host capabilities <capabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <host>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <uuid>5f58e40a-1e39-4838-95df-ccf3b2b3eaa6</uuid>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <arch>x86_64</arch>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model>EPYC-Rome-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <vendor>AMD</vendor>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <microcode version='16777317'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <signature family='23' model='49' stepping='0'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <maxphysaddr mode='emulate' bits='40'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='x2apic'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='tsc-deadline'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='osxsave'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='hypervisor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='tsc_adjust'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='spec-ctrl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='stibp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='arch-capabilities'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='cmp_legacy'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='topoext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='virt-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='lbrv'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='tsc-scale'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='vmcb-clean'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='pause-filter'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='pfthreshold'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='svme-addr-chk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='rdctl-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='skip-l1dfl-vmentry'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='mds-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature name='pschange-mc-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <pages unit='KiB' size='4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <pages unit='KiB' size='2048'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <pages unit='KiB' size='1048576'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <power_management>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <suspend_mem/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <suspend_disk/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <suspend_hybrid/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </power_management>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <iommu support='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <migration_features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <live/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <uri_transports>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <uri_transport>tcp</uri_transport>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <uri_transport>rdma</uri_transport>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </uri_transports>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </migration_features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <topology>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <cells num='1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <cell id='0'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          <memory unit='KiB'>7864312</memory>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          <pages unit='KiB' size='4'>1966078</pages>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          <pages unit='KiB' size='2048'>0</pages>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          <pages unit='KiB' size='1048576'>0</pages>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          <distances>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <sibling id='0' value='10'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          </distances>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          <cpus num='8'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:          </cpus>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        </cell>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </cells>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </topology>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <cache>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </cache>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <secmodel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model>selinux</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <doi>0</doi>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </secmodel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <secmodel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model>dac</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <doi>0</doi>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </secmodel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </host>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <guest>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <os_type>hvm</os_type>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <arch name='i686'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <wordsize>32</wordsize>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <domain type='qemu'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <domain type='kvm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </arch>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <pae/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <nonpae/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <acpi default='on' toggle='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <apic default='on' toggle='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <cpuselection/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <deviceboot/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <disksnapshot default='on' toggle='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <externalSnapshot/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </guest>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <guest>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <os_type>hvm</os_type>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <arch name='x86_64'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <wordsize>64</wordsize>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <domain type='qemu'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <domain type='kvm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </arch>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <acpi default='on' toggle='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <apic default='on' toggle='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <cpuselection/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <deviceboot/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <disksnapshot default='on' toggle='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <externalSnapshot/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </guest>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 
Dec  3 09:11:00 np0005544118 nova_compute[187283]: </capabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: #033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.420 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.424 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec  3 09:11:00 np0005544118 nova_compute[187283]: <domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <domain>kvm</domain>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <arch>i686</arch>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <vcpu max='240'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <iothreads supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <os supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='firmware'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <loader supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>rom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pflash</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='readonly'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>yes</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='secure'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </loader>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='maximum' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='maximumMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-model' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <vendor>AMD</vendor>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='x2apic'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='stibp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='succor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lbrv'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='custom' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Dhyana-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-128'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-256'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-512'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <memoryBacking supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='sourceType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>anonymous</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>memfd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </memoryBacking>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <disk supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='diskDevice'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>disk</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cdrom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>floppy</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>lun</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ide</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>fdc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>sata</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <graphics supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vnc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egl-headless</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </graphics>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <video supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='modelType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vga</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cirrus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>none</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>bochs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ramfb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hostdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='mode'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>subsystem</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='startupPolicy'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>mandatory</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>requisite</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>optional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='subsysType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pci</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='capsType'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='pciBackend'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hostdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <rng supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>random</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <filesystem supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='driverType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>path</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>handle</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtiofs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </filesystem>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <tpm supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-tis</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-crb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emulator</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>external</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendVersion'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>2.0</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </tpm>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <redirdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </redirdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <channel supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </channel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <crypto supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </crypto>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <interface supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>passt</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <panic supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>isa</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>hyperv</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </panic>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <console supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>null</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dev</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pipe</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stdio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>udp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tcp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu-vdagent</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </console>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <gic supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <vmcoreinfo supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <genid supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backingStoreInput supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backup supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <async-teardown supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <ps2 supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sev supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sgx supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hyperv supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='features'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>relaxed</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vapic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>spinlocks</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vpindex</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>runtime</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>synic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stimer</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reset</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vendor_id</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>frequencies</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reenlightenment</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tlbflush</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ipi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>avic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emsr_bitmap</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>xmm_input</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <spinlocks>4095</spinlocks>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <stimer_direct>on</stimer_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hyperv>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <launchSecurity supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='sectype'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tdx</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </launchSecurity>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: </domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.429 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec  3 09:11:00 np0005544118 nova_compute[187283]: <domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <domain>kvm</domain>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <arch>i686</arch>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <vcpu max='4096'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <iothreads supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <os supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='firmware'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <loader supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>rom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pflash</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='readonly'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>yes</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='secure'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </loader>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='maximum' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='maximumMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-model' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <vendor>AMD</vendor>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='x2apic'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='stibp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='succor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lbrv'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='custom' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Dhyana-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-128'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-256'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-512'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <memoryBacking supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='sourceType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>anonymous</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>memfd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </memoryBacking>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <disk supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='diskDevice'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>disk</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cdrom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>floppy</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>lun</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>fdc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>sata</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <graphics supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vnc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egl-headless</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </graphics>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <video supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='modelType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vga</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cirrus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>none</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>bochs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ramfb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hostdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='mode'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>subsystem</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='startupPolicy'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>mandatory</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>requisite</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>optional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='subsysType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pci</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='capsType'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='pciBackend'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hostdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <rng supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>random</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <filesystem supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='driverType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>path</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>handle</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtiofs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </filesystem>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <tpm supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-tis</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-crb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emulator</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>external</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendVersion'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>2.0</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </tpm>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <redirdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </redirdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <channel supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </channel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <crypto supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </crypto>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <interface supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>passt</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <panic supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>isa</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>hyperv</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </panic>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <console supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>null</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dev</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pipe</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stdio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>udp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tcp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu-vdagent</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </console>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <gic supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <vmcoreinfo supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <genid supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backingStoreInput supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backup supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <async-teardown supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <ps2 supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sev supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sgx supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hyperv supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='features'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>relaxed</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vapic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>spinlocks</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vpindex</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>runtime</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>synic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stimer</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reset</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vendor_id</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>frequencies</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reenlightenment</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tlbflush</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ipi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>avic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emsr_bitmap</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>xmm_input</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <spinlocks>4095</spinlocks>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <stimer_direct>on</stimer_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hyperv>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <launchSecurity supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='sectype'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tdx</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </launchSecurity>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: </domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.457 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.458 187287 DEBUG nova.virt.libvirt.volume.mount [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.461 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec  3 09:11:00 np0005544118 nova_compute[187283]: <domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <domain>kvm</domain>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <arch>x86_64</arch>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <vcpu max='240'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <iothreads supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <os supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='firmware'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <loader supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>rom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pflash</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='readonly'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>yes</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='secure'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </loader>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='maximum' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='maximumMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-model' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <vendor>AMD</vendor>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='x2apic'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='stibp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='succor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lbrv'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='custom' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Dhyana-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-128'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-256'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-512'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <memoryBacking supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='sourceType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>anonymous</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>memfd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </memoryBacking>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <disk supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='diskDevice'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>disk</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cdrom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>floppy</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>lun</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ide</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>fdc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>sata</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <graphics supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vnc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egl-headless</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </graphics>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <video supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='modelType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vga</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cirrus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>none</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>bochs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ramfb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hostdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='mode'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>subsystem</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='startupPolicy'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>mandatory</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>requisite</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>optional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='subsysType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pci</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='capsType'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='pciBackend'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hostdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <rng supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>random</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <filesystem supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='driverType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>path</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>handle</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtiofs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </filesystem>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <tpm supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-tis</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-crb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emulator</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>external</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendVersion'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>2.0</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </tpm>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <redirdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </redirdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <channel supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </channel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <crypto supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </crypto>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <interface supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>passt</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <panic supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>isa</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>hyperv</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </panic>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <console supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>null</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dev</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pipe</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stdio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>udp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tcp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu-vdagent</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </console>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <gic supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <vmcoreinfo supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <genid supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backingStoreInput supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backup supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <async-teardown supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <ps2 supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sev supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sgx supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hyperv supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='features'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>relaxed</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vapic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>spinlocks</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vpindex</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>runtime</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>synic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stimer</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reset</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vendor_id</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>frequencies</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reenlightenment</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tlbflush</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ipi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>avic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emsr_bitmap</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>xmm_input</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <spinlocks>4095</spinlocks>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <stimer_direct>on</stimer_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hyperv>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <launchSecurity supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='sectype'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tdx</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </launchSecurity>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: </domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.518 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec  3 09:11:00 np0005544118 nova_compute[187283]: <domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <path>/usr/libexec/qemu-kvm</path>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <domain>kvm</domain>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <machine>pc-q35-rhel9.8.0</machine>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <arch>x86_64</arch>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <vcpu max='4096'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <iothreads supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <os supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='firmware'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>efi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <loader supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>rom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pflash</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='readonly'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>yes</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='secure'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>yes</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>no</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </loader>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-passthrough' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='hostPassthroughMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='maximum' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='maximumMigratable'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>on</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>off</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='host-model' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model fallback='forbid'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <vendor>AMD</vendor>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='x2apic'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-deadline'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='hypervisor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc_adjust'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='spec-ctrl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='stibp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='cmp_legacy'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='overflow-recov'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='succor'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='amd-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='virt-ssbd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lbrv'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='tsc-scale'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='vmcb-clean'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='flushbyasid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pause-filter'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='pfthreshold'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='svme-addr-chk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='require' name='lfence-always-serializing'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <feature policy='disable' name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <mode name='custom' supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Broadwell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cascadelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Cooperlake-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Denverton-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Dhyana-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Genoa-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='auto-ibrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Milan-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amd-psfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='no-nested-data-bp'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='null-sel-clr-base'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='stibp-always-on'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-Rome-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='EPYC-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='GraniteRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-128'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-256'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx10-512'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='prefetchiti'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Haswell-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-noTSX'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v6'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Icelake-Server-v7'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='IvyBridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='KnightsMill-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4fmaps'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-4vnniw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512er'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512pf'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G4-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Opteron_G5-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fma4'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tbm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xop'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SapphireRapids-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='amx-tile'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-bf16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-fp16'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512-vpopcntdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bitalg'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vbmi2'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrc'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fzrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='la57'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='taa-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='tsx-ldtrk'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xfd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='SierraForest-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ifma'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-ne-convert'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx-vnni-int8'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='bus-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cmpccxadd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fbsdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='fsrs'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ibrs-all'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mcdt-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pbrsb-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='psdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='sbdr-ssdp-no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='serialize'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vaes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='vpclmulqdq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Client-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='hle'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='rtm'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Skylake-Server-v5'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512bw'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512cd'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512dq'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512f'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='avx512vl'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='invpcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pcid'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='pku'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='mpx'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v2'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v3'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='core-capability'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='split-lock-detect'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='Snowridge-v4'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='cldemote'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='erms'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='gfni'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdir64b'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='movdiri'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='xsaves'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='athlon-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='core2duo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='coreduo-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='n270-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='ss'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <blockers model='phenom-v1'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnow'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <feature name='3dnowext'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </blockers>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </mode>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <memoryBacking supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <enum name='sourceType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>anonymous</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <value>memfd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </memoryBacking>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <disk supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='diskDevice'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>disk</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cdrom</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>floppy</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>lun</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>fdc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>sata</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <graphics supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vnc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egl-headless</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </graphics>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <video supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='modelType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vga</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>cirrus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>none</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>bochs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ramfb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hostdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='mode'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>subsystem</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='startupPolicy'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>mandatory</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>requisite</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>optional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='subsysType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pci</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>scsi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='capsType'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='pciBackend'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hostdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <rng supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtio-non-transitional</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>random</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>egd</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <filesystem supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='driverType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>path</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>handle</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>virtiofs</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </filesystem>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <tpm supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-tis</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tpm-crb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emulator</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>external</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendVersion'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>2.0</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </tpm>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <redirdev supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='bus'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>usb</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </redirdev>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <channel supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </channel>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <crypto supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendModel'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>builtin</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </crypto>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <interface supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='backendType'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>default</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>passt</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <panic supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='model'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>isa</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>hyperv</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </panic>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <console supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='type'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>null</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vc</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pty</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dev</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>file</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>pipe</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stdio</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>udp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tcp</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>unix</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>qemu-vdagent</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>dbus</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </console>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <gic supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <vmcoreinfo supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <genid supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backingStoreInput supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <backup supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <async-teardown supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <ps2 supported='yes'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sev supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <sgx supported='no'/>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <hyperv supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='features'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>relaxed</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vapic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>spinlocks</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vpindex</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>runtime</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>synic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>stimer</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reset</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>vendor_id</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>frequencies</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>reenlightenment</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tlbflush</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>ipi</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>avic</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>emsr_bitmap</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>xmm_input</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <spinlocks>4095</spinlocks>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <stimer_direct>on</stimer_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_direct>on</tlbflush_direct>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <tlbflush_extended>on</tlbflush_extended>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </defaults>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </hyperv>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    <launchSecurity supported='yes'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      <enum name='sectype'>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:        <value>tdx</value>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:      </enum>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:    </launchSecurity>
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: </domainCapabilities>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.585 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.586 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.586 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.586 187287 INFO nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Secure Boot support detected#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.588 187287 INFO nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.589 187287 INFO nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.598 187287 DEBUG nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] cpu compare xml: <cpu match="exact">
Dec  3 09:11:00 np0005544118 nova_compute[187283]:  <model>Nehalem</model>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: </cpu>
Dec  3 09:11:00 np0005544118 nova_compute[187283]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.599 187287 DEBUG nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.617 187287 INFO nova.virt.node [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Determined node identity 52e95542-7192-4eec-a5dc-18596ad73a72 from /var/lib/nova/compute_id#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.637 187287 WARNING nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Compute nodes ['52e95542-7192-4eec-a5dc-18596ad73a72'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.667 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.688 187287 WARNING nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.688 187287 DEBUG oslo_concurrency.lockutils [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.688 187287 DEBUG oslo_concurrency.lockutils [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.689 187287 DEBUG oslo_concurrency.lockutils [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.689 187287 DEBUG nova.compute.resource_tracker [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.853 187287 WARNING nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.854 187287 DEBUG nova.compute.resource_tracker [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6185MB free_disk=73.53902053833008GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.855 187287 DEBUG oslo_concurrency.lockutils [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.855 187287 DEBUG oslo_concurrency.lockutils [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.867 187287 WARNING nova.compute.resource_tracker [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] No compute node record for compute-1.ctlplane.example.com:52e95542-7192-4eec-a5dc-18596ad73a72: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 52e95542-7192-4eec-a5dc-18596ad73a72 could not be found.#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.885 187287 INFO nova.compute.resource_tracker [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 52e95542-7192-4eec-a5dc-18596ad73a72#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.928 187287 DEBUG nova.compute.resource_tracker [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:11:00 np0005544118 nova_compute[187283]: 2025-12-03 14:11:00.928 187287 DEBUG nova.compute.resource_tracker [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:11:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:11:00.936 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:11:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:11:00.937 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:11:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:11:00.937 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.457 187287 INFO nova.scheduler.client.report [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [req-e3b452ef-f154-4373-821d-4abb17e304a2] Created resource provider record via placement API for resource provider with UUID 52e95542-7192-4eec-a5dc-18596ad73a72 and name compute-1.ctlplane.example.com.#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.487 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec  3 09:11:01 np0005544118 nova_compute[187283]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.488 187287 INFO nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] kernel doesn't support AMD SEV#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.488 187287 DEBUG nova.compute.provider_tree [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.488 187287 DEBUG nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.490 187287 DEBUG nova.virt.libvirt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Libvirt baseline CPU <cpu>
Dec  3 09:11:01 np0005544118 nova_compute[187283]:  <arch>x86_64</arch>
Dec  3 09:11:01 np0005544118 nova_compute[187283]:  <model>Nehalem</model>
Dec  3 09:11:01 np0005544118 nova_compute[187283]:  <vendor>AMD</vendor>
Dec  3 09:11:01 np0005544118 nova_compute[187283]:  <topology sockets="8" cores="1" threads="1"/>
Dec  3 09:11:01 np0005544118 nova_compute[187283]: </cpu>
Dec  3 09:11:01 np0005544118 nova_compute[187283]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.556 187287 DEBUG nova.scheduler.client.report [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Updated inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.556 187287 DEBUG nova.compute.provider_tree [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Updating resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.556 187287 DEBUG nova.compute.provider_tree [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.638 187287 DEBUG nova.compute.provider_tree [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Updating resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.665 187287 DEBUG nova.compute.resource_tracker [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.665 187287 DEBUG oslo_concurrency.lockutils [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.810s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.665 187287 DEBUG nova.service [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.746 187287 DEBUG nova.service [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Dec  3 09:11:01 np0005544118 nova_compute[187283]: 2025-12-03 14:11:01.747 187287 DEBUG nova.servicegroup.drivers.db [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Dec  3 09:11:05 np0005544118 systemd-logind[795]: New session 27 of user zuul.
Dec  3 09:11:05 np0005544118 systemd[1]: Started Session 27 of User zuul.
Dec  3 09:11:06 np0005544118 python3.9[187753]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec  3 09:11:07 np0005544118 podman[187881]: 2025-12-03 14:11:07.758506936 +0000 UTC m=+0.081156569 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  3 09:11:08 np0005544118 python3.9[187926]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:11:08 np0005544118 systemd[1]: Reloading.
Dec  3 09:11:08 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:11:08 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:11:09 np0005544118 python3.9[188113]: ansible-ansible.builtin.service_facts Invoked
Dec  3 09:11:09 np0005544118 network[188130]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec  3 09:11:09 np0005544118 network[188131]: 'network-scripts' will be removed from distribution in near future.
Dec  3 09:11:09 np0005544118 network[188132]: It is advised to switch to 'NetworkManager' instead for network management.
Dec  3 09:11:13 np0005544118 podman[188214]: 2025-12-03 14:11:13.69850971 +0000 UTC m=+0.087449016 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  3 09:11:16 np0005544118 python3.9[188432]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:11:16 np0005544118 nova_compute[187283]: 2025-12-03 14:11:16.749 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:16 np0005544118 nova_compute[187283]: 2025-12-03 14:11:16.779 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:17 np0005544118 python3.9[188585]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:17 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 09:11:17 np0005544118 python3.9[188738]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:18 np0005544118 python3.9[188890]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:11:19 np0005544118 python3.9[189042]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  3 09:11:20 np0005544118 python3.9[189194]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:11:20 np0005544118 systemd[1]: Reloading.
Dec  3 09:11:20 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:11:20 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:11:21 np0005544118 python3.9[189381]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:11:22 np0005544118 python3.9[189534]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:11:23 np0005544118 python3.9[189684]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:11:23 np0005544118 python3.9[189836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:24 np0005544118 python3.9[189957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771083.493464-247-59094085358531/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:11:25 np0005544118 python3.9[190109]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec  3 09:11:26 np0005544118 python3.9[190261]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec  3 09:11:27 np0005544118 python3.9[190414]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec  3 09:11:28 np0005544118 podman[190544]: 2025-12-03 14:11:28.223823072 +0000 UTC m=+0.060167231 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:11:28 np0005544118 python3.9[190591]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec  3 09:11:29 np0005544118 python3.9[190750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:30 np0005544118 python3.9[190871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764771089.3893197-383-278891882322063/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:30 np0005544118 python3.9[191021]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:31 np0005544118 python3.9[191142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764771090.4354155-383-65457244693223/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:32 np0005544118 python3.9[191292]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:32 np0005544118 python3.9[191413]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764771091.5233717-383-172180773485041/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:33 np0005544118 python3.9[191563]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:11:34 np0005544118 python3.9[191715]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:11:35 np0005544118 python3.9[191867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:35 np0005544118 python3.9[191988]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771094.5551312-501-68317294810467/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:36 np0005544118 python3.9[192138]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:36 np0005544118 python3.9[192214]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:37 np0005544118 python3.9[192364]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:37 np0005544118 python3.9[192485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771096.668842-501-108555412924184/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:38 np0005544118 podman[192609]: 2025-12-03 14:11:38.054267056 +0000 UTC m=+0.056517884 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125)
Dec  3 09:11:38 np0005544118 python3.9[192647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:38 np0005544118 python3.9[192775]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771097.7555156-501-86496119896043/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:39 np0005544118 python3.9[192925]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:39 np0005544118 python3.9[193046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771098.8514795-501-279470403668788/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:40 np0005544118 python3.9[193196]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:41 np0005544118 python3.9[193318]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771099.939886-501-94165982271417/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:41 np0005544118 python3.9[193468]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:42 np0005544118 python3.9[193589]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771101.2963398-501-188915421551687/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:43 np0005544118 python3.9[193739]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:43 np0005544118 python3.9[193860]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771102.5387213-501-132925894407185/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:43 np0005544118 podman[193908]: 2025-12-03 14:11:43.86412696 +0000 UTC m=+0.090058688 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec  3 09:11:44 np0005544118 python3.9[194038]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:44 np0005544118 python3.9[194159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771103.765111-501-61406382005113/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:45 np0005544118 python3.9[194309]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:45 np0005544118 python3.9[194430]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771104.8915606-501-65458768555662/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:46 np0005544118 python3.9[194580]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:46 np0005544118 python3.9[194701]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771106.0766265-501-222041494833759/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:48 np0005544118 python3.9[194851]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:49 np0005544118 python3.9[194927]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:50 np0005544118 python3.9[195077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:50 np0005544118 python3.9[195153]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:51 np0005544118 python3.9[195303]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:51 np0005544118 python3.9[195379]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:52 np0005544118 python3.9[195531]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:53 np0005544118 python3.9[195683]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:11:54 np0005544118 python3.9[195835]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:11:55 np0005544118 python3.9[195987]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:11:55 np0005544118 systemd[1]: Reloading.
Dec  3 09:11:55 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:11:55 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:11:55 np0005544118 systemd[1]: Listening on Podman API Socket.
Dec  3 09:11:56 np0005544118 python3.9[196178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:11:56 np0005544118 python3.9[196301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771115.8020892-945-97401757376704/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:11:57 np0005544118 python3.9[196454]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec  3 09:11:58 np0005544118 podman[196579]: 2025-12-03 14:11:58.581424492 +0000 UTC m=+0.071613334 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125)
Dec  3 09:11:58 np0005544118 python3.9[196623]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.610 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.613 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.613 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.614 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.637 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.637 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.638 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.638 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.638 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.638 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.639 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.639 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.639 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.675 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.676 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.676 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.676 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.814 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.815 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6155MB free_disk=73.53948593139648GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.815 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.815 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.888 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.888 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.906 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.924 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.925 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:11:59 np0005544118 nova_compute[187283]: 2025-12-03 14:11:59.925 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:12:00 np0005544118 python3[196777]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec  3 09:12:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:12:00.937 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:12:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:12:00.938 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:12:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:12:00.939 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:12:03 np0005544118 podman[196791]: 2025-12-03 14:12:03.023100566 +0000 UTC m=+2.839932433 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec  3 09:12:03 np0005544118 podman[196890]: 2025-12-03 14:12:03.191430361 +0000 UTC m=+0.086606842 container create 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm)
Dec  3 09:12:03 np0005544118 podman[196890]: 2025-12-03 14:12:03.168001616 +0000 UTC m=+0.063178107 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec  3 09:12:03 np0005544118 python3[196777]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec  3 09:12:04 np0005544118 python3.9[197080]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:12:04 np0005544118 python3.9[197234]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:05 np0005544118 python3.9[197385]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764771124.9460974-1051-224786211923315/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:06 np0005544118 python3.9[197463]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:12:06 np0005544118 systemd[1]: Reloading.
Dec  3 09:12:06 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:12:06 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:12:07 np0005544118 python3.9[197573]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:12:07 np0005544118 systemd[1]: Reloading.
Dec  3 09:12:07 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:12:07 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:12:07 np0005544118 systemd[1]: Starting podman_exporter container...
Dec  3 09:12:08 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:12:08 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd45d40ef227d12be03b96d1a0715cd0c29fd1a4f9b84e1e86c16874d8e7076c/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:08 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd45d40ef227d12be03b96d1a0715cd0c29fd1a4f9b84e1e86c16874d8e7076c/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:08 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.
Dec  3 09:12:08 np0005544118 podman[197613]: 2025-12-03 14:12:08.054497684 +0000 UTC m=+0.117783873 container init 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:12:08 np0005544118 podman_exporter[197628]: ts=2025-12-03T14:12:08.068Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec  3 09:12:08 np0005544118 podman_exporter[197628]: ts=2025-12-03T14:12:08.068Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec  3 09:12:08 np0005544118 podman_exporter[197628]: ts=2025-12-03T14:12:08.068Z caller=handler.go:94 level=info msg="enabled collectors"
Dec  3 09:12:08 np0005544118 podman_exporter[197628]: ts=2025-12-03T14:12:08.069Z caller=handler.go:105 level=info collector=container
Dec  3 09:12:08 np0005544118 systemd[1]: Starting Podman API Service...
Dec  3 09:12:08 np0005544118 systemd[1]: Started Podman API Service.
Dec  3 09:12:08 np0005544118 podman[197613]: 2025-12-03 14:12:08.095747717 +0000 UTC m=+0.159033906 container start 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:12:08 np0005544118 podman[197613]: podman_exporter
Dec  3 09:12:08 np0005544118 systemd[1]: Started podman_exporter container.
Dec  3 09:12:08 np0005544118 podman[197639]: time="2025-12-03T14:12:08Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec  3 09:12:08 np0005544118 podman[197639]: time="2025-12-03T14:12:08Z" level=info msg="Setting parallel job count to 25"
Dec  3 09:12:08 np0005544118 podman[197639]: time="2025-12-03T14:12:08Z" level=info msg="Using sqlite as database backend"
Dec  3 09:12:08 np0005544118 podman[197639]: time="2025-12-03T14:12:08Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec  3 09:12:08 np0005544118 podman[197639]: time="2025-12-03T14:12:08Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec  3 09:12:08 np0005544118 podman[197639]: time="2025-12-03T14:12:08Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec  3 09:12:08 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:12:08 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec  3 09:12:08 np0005544118 podman[197639]: time="2025-12-03T14:12:08Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:12:08 np0005544118 podman[197637]: 2025-12-03 14:12:08.169381755 +0000 UTC m=+0.066961722 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Dec  3 09:12:08 np0005544118 podman[197640]: 2025-12-03 14:12:08.176330519 +0000 UTC m=+0.069313108 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:12:08 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:12:08 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14065 "" "Go-http-client/1.1"
Dec  3 09:12:08 np0005544118 podman_exporter[197628]: ts=2025-12-03T14:12:08.177Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec  3 09:12:08 np0005544118 podman_exporter[197628]: ts=2025-12-03T14:12:08.178Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec  3 09:12:08 np0005544118 podman_exporter[197628]: ts=2025-12-03T14:12:08.178Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec  3 09:12:08 np0005544118 systemd[1]: 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b-3d06f6b7be68472e.service: Main process exited, code=exited, status=1/FAILURE
Dec  3 09:12:08 np0005544118 systemd[1]: 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b-3d06f6b7be68472e.service: Failed with result 'exit-code'.
Dec  3 09:12:08 np0005544118 python3.9[197839]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:12:08 np0005544118 systemd[1]: Stopping podman_exporter container...
Dec  3 09:12:09 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:12:08 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3926 "" "Go-http-client/1.1"
Dec  3 09:12:09 np0005544118 systemd[1]: libpod-910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.scope: Deactivated successfully.
Dec  3 09:12:09 np0005544118 podman[197844]: 2025-12-03 14:12:09.074234258 +0000 UTC m=+0.061391627 container died 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:12:09 np0005544118 systemd[1]: 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b-3d06f6b7be68472e.timer: Deactivated successfully.
Dec  3 09:12:09 np0005544118 systemd[1]: Stopped /usr/bin/podman healthcheck run 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.
Dec  3 09:12:09 np0005544118 systemd[1]: var-lib-containers-storage-overlay-cd45d40ef227d12be03b96d1a0715cd0c29fd1a4f9b84e1e86c16874d8e7076c-merged.mount: Deactivated successfully.
Dec  3 09:12:09 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b-userdata-shm.mount: Deactivated successfully.
Dec  3 09:12:09 np0005544118 podman[197844]: 2025-12-03 14:12:09.43204216 +0000 UTC m=+0.419199489 container cleanup 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:12:09 np0005544118 podman[197844]: podman_exporter
Dec  3 09:12:09 np0005544118 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec  3 09:12:09 np0005544118 podman[197874]: podman_exporter
Dec  3 09:12:09 np0005544118 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec  3 09:12:09 np0005544118 systemd[1]: Stopped podman_exporter container.
Dec  3 09:12:09 np0005544118 systemd[1]: Starting podman_exporter container...
Dec  3 09:12:09 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:12:09 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd45d40ef227d12be03b96d1a0715cd0c29fd1a4f9b84e1e86c16874d8e7076c/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:09 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cd45d40ef227d12be03b96d1a0715cd0c29fd1a4f9b84e1e86c16874d8e7076c/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:09 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.
Dec  3 09:12:09 np0005544118 podman[197887]: 2025-12-03 14:12:09.650695501 +0000 UTC m=+0.127542706 container init 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:12:09 np0005544118 podman_exporter[197902]: ts=2025-12-03T14:12:09.666Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec  3 09:12:09 np0005544118 podman_exporter[197902]: ts=2025-12-03T14:12:09.666Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec  3 09:12:09 np0005544118 podman_exporter[197902]: ts=2025-12-03T14:12:09.667Z caller=handler.go:94 level=info msg="enabled collectors"
Dec  3 09:12:09 np0005544118 podman_exporter[197902]: ts=2025-12-03T14:12:09.667Z caller=handler.go:105 level=info collector=container
Dec  3 09:12:09 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:12:09 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec  3 09:12:09 np0005544118 podman[197639]: time="2025-12-03T14:12:09Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:12:09 np0005544118 podman[197887]: 2025-12-03 14:12:09.672722387 +0000 UTC m=+0.149569572 container start 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:12:09 np0005544118 podman[197887]: podman_exporter
Dec  3 09:12:09 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:12:09 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 14067 "" "Go-http-client/1.1"
Dec  3 09:12:09 np0005544118 podman_exporter[197902]: ts=2025-12-03T14:12:09.682Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec  3 09:12:09 np0005544118 podman_exporter[197902]: ts=2025-12-03T14:12:09.682Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec  3 09:12:09 np0005544118 systemd[1]: Started podman_exporter container.
Dec  3 09:12:09 np0005544118 podman_exporter[197902]: ts=2025-12-03T14:12:09.683Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec  3 09:12:09 np0005544118 podman[197911]: 2025-12-03 14:12:09.750891672 +0000 UTC m=+0.068595018 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:12:10 np0005544118 python3.9[198087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:12:10 np0005544118 python3.9[198210]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764771129.9039145-1115-98633852654218/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec  3 09:12:11 np0005544118 python3.9[198362]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec  3 09:12:12 np0005544118 python3.9[198516]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec  3 09:12:13 np0005544118 python3[198668]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec  3 09:12:14 np0005544118 podman[198708]: 2025-12-03 14:12:14.868507901 +0000 UTC m=+0.093136165 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller)
Dec  3 09:12:16 np0005544118 podman[198681]: 2025-12-03 14:12:16.249965296 +0000 UTC m=+2.724508637 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  3 09:12:16 np0005544118 podman[198801]: 2025-12-03 14:12:16.390099573 +0000 UTC m=+0.051715907 container create ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  3 09:12:16 np0005544118 podman[198801]: 2025-12-03 14:12:16.36568215 +0000 UTC m=+0.027298504 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  3 09:12:16 np0005544118 python3[198668]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec  3 09:12:18 np0005544118 python3.9[198991]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:12:19 np0005544118 python3.9[199145]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:19 np0005544118 auditd[704]: Audit daemon rotating log files
Dec  3 09:12:20 np0005544118 python3.9[199296]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764771139.6087058-1221-132348894996663/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:20 np0005544118 python3.9[199372]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec  3 09:12:20 np0005544118 systemd[1]: Reloading.
Dec  3 09:12:20 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:12:20 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:12:21 np0005544118 python3.9[199482]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec  3 09:12:21 np0005544118 systemd[1]: Reloading.
Dec  3 09:12:21 np0005544118 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec  3 09:12:21 np0005544118 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec  3 09:12:21 np0005544118 systemd[1]: Starting openstack_network_exporter container...
Dec  3 09:12:22 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:12:22 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/487a8cecbe5887fe78bc1bb35394008837771257b197c4516dd72364c99433d1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:22 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/487a8cecbe5887fe78bc1bb35394008837771257b197c4516dd72364c99433d1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:22 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/487a8cecbe5887fe78bc1bb35394008837771257b197c4516dd72364c99433d1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:22 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.
Dec  3 09:12:22 np0005544118 podman[199523]: 2025-12-03 14:12:22.071743807 +0000 UTC m=+0.124431639 container init ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *bridge.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *coverage.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *datapath.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *iface.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *memory.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *ovnnorthd.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *ovn.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *ovsdbserver.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *pmd_perf.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *pmd_rxq.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: INFO    14:12:22 main.go:48: registering *vswitch.Collector
Dec  3 09:12:22 np0005544118 openstack_network_exporter[199539]: NOTICE  14:12:22 main.go:76: listening on https://:9105/metrics
Dec  3 09:12:22 np0005544118 podman[199523]: 2025-12-03 14:12:22.107297451 +0000 UTC m=+0.159985323 container start ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:12:22 np0005544118 podman[199523]: openstack_network_exporter
Dec  3 09:12:22 np0005544118 systemd[1]: Started openstack_network_exporter container.
Dec  3 09:12:22 np0005544118 podman[199544]: 2025-12-03 14:12:22.204280441 +0000 UTC m=+0.090515850 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  3 09:12:23 np0005544118 python3.9[199724]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec  3 09:12:23 np0005544118 systemd[1]: Stopping openstack_network_exporter container...
Dec  3 09:12:23 np0005544118 systemd[1]: libpod-ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.scope: Deactivated successfully.
Dec  3 09:12:23 np0005544118 podman[199728]: 2025-12-03 14:12:23.292056947 +0000 UTC m=+0.101962142 container died ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal)
Dec  3 09:12:23 np0005544118 systemd[1]: ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833-28a704bb9c492079.timer: Deactivated successfully.
Dec  3 09:12:23 np0005544118 systemd[1]: Stopped /usr/bin/podman healthcheck run ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.
Dec  3 09:12:23 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833-userdata-shm.mount: Deactivated successfully.
Dec  3 09:12:23 np0005544118 systemd[1]: var-lib-containers-storage-overlay-487a8cecbe5887fe78bc1bb35394008837771257b197c4516dd72364c99433d1-merged.mount: Deactivated successfully.
Dec  3 09:12:24 np0005544118 podman[199728]: 2025-12-03 14:12:24.586706705 +0000 UTC m=+1.396611860 container cleanup ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  3 09:12:24 np0005544118 podman[199728]: openstack_network_exporter
Dec  3 09:12:24 np0005544118 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec  3 09:12:24 np0005544118 podman[199757]: openstack_network_exporter
Dec  3 09:12:24 np0005544118 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec  3 09:12:24 np0005544118 systemd[1]: Stopped openstack_network_exporter container.
Dec  3 09:12:24 np0005544118 systemd[1]: Starting openstack_network_exporter container...
Dec  3 09:12:24 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:12:24 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/487a8cecbe5887fe78bc1bb35394008837771257b197c4516dd72364c99433d1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:24 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/487a8cecbe5887fe78bc1bb35394008837771257b197c4516dd72364c99433d1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:24 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/487a8cecbe5887fe78bc1bb35394008837771257b197c4516dd72364c99433d1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec  3 09:12:24 np0005544118 systemd[1]: Started /usr/bin/podman healthcheck run ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.
Dec  3 09:12:24 np0005544118 podman[199770]: 2025-12-03 14:12:24.795157962 +0000 UTC m=+0.121175308 container init ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *bridge.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *coverage.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *datapath.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *iface.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *memory.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *ovnnorthd.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *ovn.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *ovsdbserver.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *pmd_perf.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *pmd_rxq.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: INFO    14:12:24 main.go:48: registering *vswitch.Collector
Dec  3 09:12:24 np0005544118 openstack_network_exporter[199786]: NOTICE  14:12:24 main.go:76: listening on https://:9105/metrics
Dec  3 09:12:24 np0005544118 podman[199770]: 2025-12-03 14:12:24.820643034 +0000 UTC m=+0.146660380 container start ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  3 09:12:24 np0005544118 podman[199770]: openstack_network_exporter
Dec  3 09:12:24 np0005544118 systemd[1]: Started openstack_network_exporter container.
Dec  3 09:12:24 np0005544118 podman[199796]: 2025-12-03 14:12:24.913665094 +0000 UTC m=+0.083942967 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec  3 09:12:25 np0005544118 python3.9[199970]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec  3 09:12:28 np0005544118 podman[199995]: 2025-12-03 14:12:28.842826834 +0000 UTC m=+0.068602169 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  3 09:12:38 np0005544118 podman[200015]: 2025-12-03 14:12:38.841728354 +0000 UTC m=+0.074909016 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  3 09:12:40 np0005544118 podman[200034]: 2025-12-03 14:12:40.840647038 +0000 UTC m=+0.064862644 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:12:44 np0005544118 python3.9[200185]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec  3 09:12:45 np0005544118 podman[200322]: 2025-12-03 14:12:45.143490136 +0000 UTC m=+0.116414240 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:12:45 np0005544118 python3.9[200371]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:45 np0005544118 systemd[1]: Started libpod-conmon-6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1.scope.
Dec  3 09:12:45 np0005544118 podman[200377]: 2025-12-03 14:12:45.416439212 +0000 UTC m=+0.094742845 container exec 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:12:45 np0005544118 podman[200377]: 2025-12-03 14:12:45.447928394 +0000 UTC m=+0.126232067 container exec_died 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:12:45 np0005544118 systemd[1]: libpod-conmon-6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1.scope: Deactivated successfully.
Dec  3 09:12:46 np0005544118 python3.9[200559]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:46 np0005544118 systemd[1]: Started libpod-conmon-6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1.scope.
Dec  3 09:12:46 np0005544118 podman[200560]: 2025-12-03 14:12:46.338371909 +0000 UTC m=+0.084355384 container exec 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:12:46 np0005544118 podman[200560]: 2025-12-03 14:12:46.367810515 +0000 UTC m=+0.113793970 container exec_died 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:12:46 np0005544118 systemd[1]: libpod-conmon-6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1.scope: Deactivated successfully.
Dec  3 09:12:47 np0005544118 python3.9[200741]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:47 np0005544118 python3.9[200893]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec  3 09:12:48 np0005544118 python3.9[201058]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:48 np0005544118 systemd[1]: Started libpod-conmon-f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c.scope.
Dec  3 09:12:48 np0005544118 podman[201059]: 2025-12-03 14:12:48.619764584 +0000 UTC m=+0.074167619 container exec f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent)
Dec  3 09:12:48 np0005544118 podman[201059]: 2025-12-03 14:12:48.657812653 +0000 UTC m=+0.112215618 container exec_died f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:12:48 np0005544118 systemd[1]: libpod-conmon-f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c.scope: Deactivated successfully.
Dec  3 09:12:49 np0005544118 python3.9[201241]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:49 np0005544118 systemd[1]: Started libpod-conmon-f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c.scope.
Dec  3 09:12:49 np0005544118 podman[201242]: 2025-12-03 14:12:49.417166971 +0000 UTC m=+0.059731808 container exec f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent)
Dec  3 09:12:49 np0005544118 podman[201242]: 2025-12-03 14:12:49.450712809 +0000 UTC m=+0.093277646 container exec_died f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  3 09:12:49 np0005544118 systemd[1]: libpod-conmon-f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c.scope: Deactivated successfully.
Dec  3 09:12:50 np0005544118 python3.9[201426]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:50 np0005544118 python3.9[201578]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec  3 09:12:51 np0005544118 python3.9[201743]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:51 np0005544118 systemd[1]: Started libpod-conmon-6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.scope.
Dec  3 09:12:51 np0005544118 podman[201744]: 2025-12-03 14:12:51.574411196 +0000 UTC m=+0.078678620 container exec 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  3 09:12:51 np0005544118 podman[201744]: 2025-12-03 14:12:51.61006549 +0000 UTC m=+0.114332864 container exec_died 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:12:51 np0005544118 systemd[1]: libpod-conmon-6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.scope: Deactivated successfully.
Dec  3 09:12:52 np0005544118 python3.9[201928]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:52 np0005544118 systemd[1]: Started libpod-conmon-6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.scope.
Dec  3 09:12:52 np0005544118 podman[201929]: 2025-12-03 14:12:52.36173218 +0000 UTC m=+0.090836189 container exec 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Dec  3 09:12:52 np0005544118 podman[201929]: 2025-12-03 14:12:52.396894992 +0000 UTC m=+0.125998971 container exec_died 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  3 09:12:52 np0005544118 systemd[1]: libpod-conmon-6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115.scope: Deactivated successfully.
Dec  3 09:12:53 np0005544118 python3.9[202111]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:53 np0005544118 python3.9[202263]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec  3 09:12:54 np0005544118 python3.9[202428]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:54 np0005544118 systemd[1]: Started libpod-conmon-910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.scope.
Dec  3 09:12:54 np0005544118 podman[202429]: 2025-12-03 14:12:54.643331951 +0000 UTC m=+0.088890617 container exec 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:12:54 np0005544118 podman[202429]: 2025-12-03 14:12:54.675002907 +0000 UTC m=+0.120561543 container exec_died 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:12:54 np0005544118 systemd[1]: libpod-conmon-910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.scope: Deactivated successfully.
Dec  3 09:12:55 np0005544118 podman[202583]: 2025-12-03 14:12:55.223687005 +0000 UTC m=+0.084105657 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, name=ubi9-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  3 09:12:55 np0005544118 python3.9[202629]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:55 np0005544118 systemd[1]: Started libpod-conmon-910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.scope.
Dec  3 09:12:55 np0005544118 podman[202634]: 2025-12-03 14:12:55.821586194 +0000 UTC m=+0.434595321 container exec 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:12:55 np0005544118 podman[202634]: 2025-12-03 14:12:55.852370627 +0000 UTC m=+0.465379714 container exec_died 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:12:55 np0005544118 systemd[1]: libpod-conmon-910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b.scope: Deactivated successfully.
Dec  3 09:12:56 np0005544118 python3.9[202817]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:57 np0005544118 python3.9[202969]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec  3 09:12:57 np0005544118 python3.9[203134]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:57 np0005544118 systemd[1]: Started libpod-conmon-ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.scope.
Dec  3 09:12:57 np0005544118 podman[203135]: 2025-12-03 14:12:57.959047414 +0000 UTC m=+0.072730320 container exec ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Dec  3 09:12:57 np0005544118 podman[203135]: 2025-12-03 14:12:57.966972327 +0000 UTC m=+0.080655243 container exec_died ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  3 09:12:58 np0005544118 systemd[1]: libpod-conmon-ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.scope: Deactivated successfully.
Dec  3 09:12:58 np0005544118 python3.9[203320]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec  3 09:12:58 np0005544118 systemd[1]: Started libpod-conmon-ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.scope.
Dec  3 09:12:58 np0005544118 podman[203321]: 2025-12-03 14:12:58.698502153 +0000 UTC m=+0.082980006 container exec ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec  3 09:12:58 np0005544118 podman[203340]: 2025-12-03 14:12:58.760798228 +0000 UTC m=+0.051848194 container exec_died ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=)
Dec  3 09:12:58 np0005544118 podman[203321]: 2025-12-03 14:12:58.768477657 +0000 UTC m=+0.152955520 container exec_died ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6)
Dec  3 09:12:58 np0005544118 systemd[1]: libpod-conmon-ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833.scope: Deactivated successfully.
Dec  3 09:12:59 np0005544118 podman[203476]: 2025-12-03 14:12:59.316320261 +0000 UTC m=+0.056921682 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec  3 09:12:59 np0005544118 python3.9[203524]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:12:59 np0005544118 nova_compute[187283]: 2025-12-03 14:12:59.914 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:12:59 np0005544118 nova_compute[187283]: 2025-12-03 14:12:59.975 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:12:59 np0005544118 nova_compute[187283]: 2025-12-03 14:12:59.976 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:12:59 np0005544118 nova_compute[187283]: 2025-12-03 14:12:59.976 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:13:00 np0005544118 nova_compute[187283]: 2025-12-03 14:13:00.026 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:13:00 np0005544118 nova_compute[187283]: 2025-12-03 14:13:00.027 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:00 np0005544118 python3.9[203677]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:00 np0005544118 nova_compute[187283]: 2025-12-03 14:13:00.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:00 np0005544118 nova_compute[187283]: 2025-12-03 14:13:00.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:00 np0005544118 python3.9[203829]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:13:00.939 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:13:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:13:00.940 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:13:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:13:00.940 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:13:01 np0005544118 python3.9[203952]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764771180.4607131-1651-35301363909900/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.634 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.634 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.634 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.635 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.797 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.798 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6017MB free_disk=73.3711051940918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.798 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.798 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.926 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.926 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.948 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.962 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.964 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:13:01 np0005544118 nova_compute[187283]: 2025-12-03 14:13:01.964 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:13:02 np0005544118 python3.9[204104]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:02 np0005544118 python3.9[204256]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:03 np0005544118 python3.9[204334]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:04 np0005544118 python3.9[204486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:04 np0005544118 python3.9[204564]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.df3sxbcn recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:05 np0005544118 python3.9[204716]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:05 np0005544118 python3.9[204794]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:06 np0005544118 python3.9[204946]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:13:07 np0005544118 python3[205099]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec  3 09:13:08 np0005544118 python3.9[205251]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:08 np0005544118 python3.9[205329]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:09 np0005544118 podman[205453]: 2025-12-03 14:13:09.284521542 +0000 UTC m=+0.066261075 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:13:09 np0005544118 python3.9[205500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:09 np0005544118 python3.9[205578]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:10 np0005544118 python3.9[205730]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:11 np0005544118 podman[205780]: 2025-12-03 14:13:11.045158112 +0000 UTC m=+0.055424021 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:13:11 np0005544118 python3.9[205832]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:11 np0005544118 python3.9[205984]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:12 np0005544118 python3.9[206062]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:13 np0005544118 python3.9[206214]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec  3 09:13:13 np0005544118 python3.9[206339]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764771192.634926-1901-137388938005752/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:14 np0005544118 python3.9[206491]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:15 np0005544118 python3.9[206643]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:13:15 np0005544118 podman[206723]: 2025-12-03 14:13:15.872581053 +0000 UTC m=+0.103361138 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  3 09:13:16 np0005544118 python3.9[206825]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:16 np0005544118 python3.9[206977]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:13:17 np0005544118 python3.9[207130]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec  3 09:13:18 np0005544118 python3.9[207284]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec  3 09:13:19 np0005544118 python3.9[207439]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec  3 09:13:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:13:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:13:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:13:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:13:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:13:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:13:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:13:19 np0005544118 systemd[1]: session-27.scope: Deactivated successfully.
Dec  3 09:13:19 np0005544118 systemd[1]: session-27.scope: Consumed 1min 19.874s CPU time.
Dec  3 09:13:19 np0005544118 systemd-logind[795]: Session 27 logged out. Waiting for processes to exit.
Dec  3 09:13:19 np0005544118 systemd-logind[795]: Removed session 27.
Dec  3 09:13:25 np0005544118 podman[207469]: 2025-12-03 14:13:25.82341767 +0000 UTC m=+0.059410309 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9-minimal, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:13:29 np0005544118 podman[207492]: 2025-12-03 14:13:29.826993208 +0000 UTC m=+0.058931666 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  3 09:13:35 np0005544118 podman[197639]: time="2025-12-03T14:13:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:13:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:13:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:13:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:13:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2559 "" "Go-http-client/1.1"
Dec  3 09:13:39 np0005544118 podman[207515]: 2025-12-03 14:13:39.818404893 +0000 UTC m=+0.047473096 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:13:41 np0005544118 podman[207535]: 2025-12-03 14:13:41.842234008 +0000 UTC m=+0.066735067 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:13:46 np0005544118 podman[207562]: 2025-12-03 14:13:46.846272508 +0000 UTC m=+0.077975281 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  3 09:13:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:13:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:13:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:13:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:13:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:13:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:13:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:13:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:13:56 np0005544118 podman[207588]: 2025-12-03 14:13:56.81434588 +0000 UTC m=+0.052342723 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, config_id=edpm, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  3 09:13:59 np0005544118 nova_compute[187283]: 2025-12-03 14:13:59.965 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:13:59 np0005544118 nova_compute[187283]: 2025-12-03 14:13:59.966 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:13:59 np0005544118 nova_compute[187283]: 2025-12-03 14:13:59.966 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:14:00 np0005544118 podman[207609]: 2025-12-03 14:14:00.831720542 +0000 UTC m=+0.068678789 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible)
Dec  3 09:14:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:14:00.940 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:14:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:14:00.942 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:14:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:14:00.942 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.214 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.214 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.214 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.214 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.215 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.215 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.934 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.935 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.935 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:14:02 np0005544118 nova_compute[187283]: 2025-12-03 14:14:02.935 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.079 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.080 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6118MB free_disk=73.37077713012695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.080 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.080 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.167 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.167 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.187 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.202 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.203 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.203 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.596 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:03 np0005544118 nova_compute[187283]: 2025-12-03 14:14:03.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:14:10 np0005544118 podman[207629]: 2025-12-03 14:14:10.826486673 +0000 UTC m=+0.055932865 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec  3 09:14:12 np0005544118 podman[207649]: 2025-12-03 14:14:12.816405584 +0000 UTC m=+0.052675093 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:14:17 np0005544118 podman[207676]: 2025-12-03 14:14:17.880327816 +0000 UTC m=+0.102478712 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  3 09:14:27 np0005544118 podman[207704]: 2025-12-03 14:14:27.815318493 +0000 UTC m=+0.049896314 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec  3 09:14:29 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:14:29.412 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:14:29 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:14:29.413 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:14:29 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:14:29.414 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:14:31 np0005544118 podman[207725]: 2025-12-03 14:14:31.843188454 +0000 UTC m=+0.077409138 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125)
Dec  3 09:14:35 np0005544118 podman[197639]: time="2025-12-03T14:14:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:14:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:14:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:14:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:14:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2566 "" "Go-http-client/1.1"
Dec  3 09:14:41 np0005544118 podman[207748]: 2025-12-03 14:14:41.821634672 +0000 UTC m=+0.055749501 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:14:43 np0005544118 podman[207767]: 2025-12-03 14:14:43.83673392 +0000 UTC m=+0.065736135 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:14:48 np0005544118 podman[207791]: 2025-12-03 14:14:48.86650351 +0000 UTC m=+0.100481766 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:14:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:14:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:14:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:14:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:14:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:14:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:14:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:14:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:14:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:14:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:14:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:14:58 np0005544118 podman[207819]: 2025-12-03 14:14:58.853339527 +0000 UTC m=+0.090437331 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal)
Dec  3 09:14:59 np0005544118 nova_compute[187283]: 2025-12-03 14:14:59.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:14:59 np0005544118 nova_compute[187283]: 2025-12-03 14:14:59.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:14:59 np0005544118 nova_compute[187283]: 2025-12-03 14:14:59.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:15:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:15:00.942 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:15:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:15:00.943 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:15:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:15:00.943 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:15:01 np0005544118 nova_compute[187283]: 2025-12-03 14:15:01.677 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:15:01 np0005544118 nova_compute[187283]: 2025-12-03 14:15:01.677 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:01 np0005544118 nova_compute[187283]: 2025-12-03 14:15:01.677 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:01 np0005544118 nova_compute[187283]: 2025-12-03 14:15:01.677 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:02 np0005544118 nova_compute[187283]: 2025-12-03 14:15:02.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:02 np0005544118 podman[207840]: 2025-12-03 14:15:02.852431736 +0000 UTC m=+0.084685005 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Dec  3 09:15:05 np0005544118 podman[197639]: time="2025-12-03T14:15:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:15:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:15:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:15:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:15:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2566 "" "Go-http-client/1.1"
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.725 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.725 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.725 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.726 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.726 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.840 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.840 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.840 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.840 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.963 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.964 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6156MB free_disk=73.37463760375977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.965 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:15:05 np0005544118 nova_compute[187283]: 2025-12-03 14:15:05.965 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:15:06 np0005544118 nova_compute[187283]: 2025-12-03 14:15:06.026 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:15:06 np0005544118 nova_compute[187283]: 2025-12-03 14:15:06.027 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:15:06 np0005544118 nova_compute[187283]: 2025-12-03 14:15:06.045 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:15:06 np0005544118 nova_compute[187283]: 2025-12-03 14:15:06.068 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:15:06 np0005544118 nova_compute[187283]: 2025-12-03 14:15:06.069 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:15:06 np0005544118 nova_compute[187283]: 2025-12-03 14:15:06.070 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:15:09 np0005544118 nova_compute[187283]: 2025-12-03 14:15:09.064 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:12 np0005544118 podman[207860]: 2025-12-03 14:15:12.8388855 +0000 UTC m=+0.074887086 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  3 09:15:14 np0005544118 podman[207880]: 2025-12-03 14:15:14.838367243 +0000 UTC m=+0.067673400 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:15:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:15:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:15:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:15:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:15:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:15:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:15:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:15:19 np0005544118 podman[207904]: 2025-12-03 14:15:19.845297571 +0000 UTC m=+0.073422684 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:15:29 np0005544118 podman[207931]: 2025-12-03 14:15:29.855356019 +0000 UTC m=+0.093393304 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Dec  3 09:15:33 np0005544118 podman[207952]: 2025-12-03 14:15:33.830881097 +0000 UTC m=+0.064854529 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  3 09:15:35 np0005544118 podman[197639]: time="2025-12-03T14:15:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:15:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:15:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:15:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:15:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec  3 09:15:43 np0005544118 podman[207973]: 2025-12-03 14:15:43.833565963 +0000 UTC m=+0.060389491 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:15:45 np0005544118 podman[207993]: 2025-12-03 14:15:45.845724288 +0000 UTC m=+0.077224999 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:15:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:15:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:15:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:15:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:15:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:15:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:15:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:15:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:15:50 np0005544118 podman[208017]: 2025-12-03 14:15:50.846361703 +0000 UTC m=+0.083440986 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:15:59 np0005544118 nova_compute[187283]: 2025-12-03 14:15:59.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:59 np0005544118 nova_compute[187283]: 2025-12-03 14:15:59.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:15:59 np0005544118 nova_compute[187283]: 2025-12-03 14:15:59.627 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:15:59 np0005544118 nova_compute[187283]: 2025-12-03 14:15:59.627 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:15:59 np0005544118 nova_compute[187283]: 2025-12-03 14:15:59.628 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 09:15:59 np0005544118 nova_compute[187283]: 2025-12-03 14:15:59.650 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:00 np0005544118 nova_compute[187283]: 2025-12-03 14:16:00.662 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:00 np0005544118 nova_compute[187283]: 2025-12-03 14:16:00.662 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:16:00 np0005544118 nova_compute[187283]: 2025-12-03 14:16:00.662 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:16:00 np0005544118 nova_compute[187283]: 2025-12-03 14:16:00.681 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:16:00 np0005544118 nova_compute[187283]: 2025-12-03 14:16:00.681 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:00 np0005544118 nova_compute[187283]: 2025-12-03 14:16:00.681 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:00 np0005544118 podman[208044]: 2025-12-03 14:16:00.846302667 +0000 UTC m=+0.079158622 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:16:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:16:00.943 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:16:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:16:00.944 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:16:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:16:00.944 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:16:01 np0005544118 nova_compute[187283]: 2025-12-03 14:16:01.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:02 np0005544118 nova_compute[187283]: 2025-12-03 14:16:02.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.640 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.641 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.641 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.641 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.801 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.802 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6194MB free_disk=73.37463760375977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.802 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.803 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:16:04 np0005544118 podman[208065]: 2025-12-03 14:16:04.823344339 +0000 UTC m=+0.052245656 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.959 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.959 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:16:04 np0005544118 nova_compute[187283]: 2025-12-03 14:16:04.978 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.036 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.036 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.068 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.094 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.113 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.149 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.152 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:16:05 np0005544118 nova_compute[187283]: 2025-12-03 14:16:05.152 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:16:05 np0005544118 podman[197639]: time="2025-12-03T14:16:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:16:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:16:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:16:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:16:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2571 "" "Go-http-client/1.1"
Dec  3 09:16:06 np0005544118 nova_compute[187283]: 2025-12-03 14:16:06.152 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:06 np0005544118 nova_compute[187283]: 2025-12-03 14:16:06.153 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:16:06 np0005544118 nova_compute[187283]: 2025-12-03 14:16:06.603 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:06 np0005544118 nova_compute[187283]: 2025-12-03 14:16:06.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:16:14 np0005544118 podman[208083]: 2025-12-03 14:16:14.826983522 +0000 UTC m=+0.056960589 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:16:16 np0005544118 podman[208103]: 2025-12-03 14:16:16.832864066 +0000 UTC m=+0.058456521 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:16:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:16:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:16:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:16:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:16:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:16:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:16:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:16:21 np0005544118 podman[208129]: 2025-12-03 14:16:21.871319015 +0000 UTC m=+0.100783458 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:16:31 np0005544118 podman[208155]: 2025-12-03 14:16:31.819368006 +0000 UTC m=+0.056906646 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal 
Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:16:35 np0005544118 podman[197639]: time="2025-12-03T14:16:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:16:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:16:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:16:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:16:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2569 "" "Go-http-client/1.1"
Dec  3 09:16:35 np0005544118 podman[208176]: 2025-12-03 14:16:35.81960465 +0000 UTC m=+0.057109603 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec  3 09:16:45 np0005544118 podman[208196]: 2025-12-03 14:16:45.816049552 +0000 UTC m=+0.051077804 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  3 09:16:47 np0005544118 podman[208216]: 2025-12-03 14:16:47.851451522 +0000 UTC m=+0.073938513 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:16:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:16:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:16:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:16:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:16:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:16:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:16:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:16:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:16:52 np0005544118 podman[208242]: 2025-12-03 14:16:52.312126929 +0000 UTC m=+0.128312039 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Dec  3 09:17:00 np0005544118 nova_compute[187283]: 2025-12-03 14:17:00.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:17:00.947 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:17:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:17:00.948 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:17:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:17:00.948 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:17:01 np0005544118 nova_compute[187283]: 2025-12-03 14:17:01.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:02 np0005544118 nova_compute[187283]: 2025-12-03 14:17:02.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:02 np0005544118 nova_compute[187283]: 2025-12-03 14:17:02.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:17:02 np0005544118 nova_compute[187283]: 2025-12-03 14:17:02.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:17:02 np0005544118 nova_compute[187283]: 2025-12-03 14:17:02.626 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:17:02 np0005544118 nova_compute[187283]: 2025-12-03 14:17:02.627 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:02 np0005544118 podman[208269]: 2025-12-03 14:17:02.837046292 +0000 UTC m=+0.071405051 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public)
Dec  3 09:17:04 np0005544118 nova_compute[187283]: 2025-12-03 14:17:04.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.234 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:05 np0005544118 podman[197639]: time="2025-12-03T14:17:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:17:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:17:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:17:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:17:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2568 "" "Go-http-client/1.1"
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.785 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.785 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.786 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.786 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.934 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.935 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6190MB free_disk=73.37460708618164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.935 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:17:05 np0005544118 nova_compute[187283]: 2025-12-03 14:17:05.935 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:17:06 np0005544118 nova_compute[187283]: 2025-12-03 14:17:06.109 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:17:06 np0005544118 nova_compute[187283]: 2025-12-03 14:17:06.110 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:17:06 np0005544118 nova_compute[187283]: 2025-12-03 14:17:06.130 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:17:06 np0005544118 nova_compute[187283]: 2025-12-03 14:17:06.396 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:17:06 np0005544118 nova_compute[187283]: 2025-12-03 14:17:06.397 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:17:06 np0005544118 nova_compute[187283]: 2025-12-03 14:17:06.397 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.462s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:17:06 np0005544118 podman[208290]: 2025-12-03 14:17:06.817213655 +0000 UTC m=+0.056267714 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:17:07 np0005544118 nova_compute[187283]: 2025-12-03 14:17:07.393 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:07 np0005544118 nova_compute[187283]: 2025-12-03 14:17:07.393 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:07 np0005544118 nova_compute[187283]: 2025-12-03 14:17:07.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:17:07 np0005544118 nova_compute[187283]: 2025-12-03 14:17:07.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:17:16 np0005544118 podman[208312]: 2025-12-03 14:17:16.827321278 +0000 UTC m=+0.059569839 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  3 09:17:18 np0005544118 podman[208332]: 2025-12-03 14:17:18.835521588 +0000 UTC m=+0.072114202 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:17:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:17:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:17:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:17:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:17:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:17:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:17:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:17:22 np0005544118 podman[208356]: 2025-12-03 14:17:22.856462154 +0000 UTC m=+0.087037152 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:17:33 np0005544118 podman[208382]: 2025-12-03 14:17:33.816152158 +0000 UTC m=+0.049471888 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base 
Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.openshift.expose-services=, vendor=Red Hat, Inc., release=1755695350)
Dec  3 09:17:35 np0005544118 podman[197639]: time="2025-12-03T14:17:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:17:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:17:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:17:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:17:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Dec  3 09:17:37 np0005544118 podman[208404]: 2025-12-03 14:17:37.839018582 +0000 UTC m=+0.068060233 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  3 09:17:47 np0005544118 podman[208425]: 2025-12-03 14:17:47.833411272 +0000 UTC m=+0.070282738 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:17:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:17:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:17:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:17:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:17:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:17:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:17:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:17:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:17:49 np0005544118 podman[208444]: 2025-12-03 14:17:49.837322567 +0000 UTC m=+0.069992669 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:17:53 np0005544118 podman[208468]: 2025-12-03 14:17:53.840987419 +0000 UTC m=+0.078753043 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  3 09:18:00 np0005544118 nova_compute[187283]: 2025-12-03 14:18:00.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:00.949 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:00.949 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:00.949 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:02 np0005544118 nova_compute[187283]: 2025-12-03 14:18:02.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:03 np0005544118 nova_compute[187283]: 2025-12-03 14:18:03.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:04 np0005544118 nova_compute[187283]: 2025-12-03 14:18:04.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:04 np0005544118 nova_compute[187283]: 2025-12-03 14:18:04.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:18:04 np0005544118 nova_compute[187283]: 2025-12-03 14:18:04.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:18:04 np0005544118 nova_compute[187283]: 2025-12-03 14:18:04.678 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:18:04 np0005544118 podman[208495]: 2025-12-03 14:18:04.824375501 +0000 UTC m=+0.056905592 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  3 09:18:05 np0005544118 nova_compute[187283]: 2025-12-03 14:18:05.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:05 np0005544118 nova_compute[187283]: 2025-12-03 14:18:05.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:05 np0005544118 podman[197639]: time="2025-12-03T14:18:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:18:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:18:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:18:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:18:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2567 "" "Go-http-client/1.1"
Dec  3 09:18:05 np0005544118 nova_compute[187283]: 2025-12-03 14:18:05.956 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:05 np0005544118 nova_compute[187283]: 2025-12-03 14:18:05.956 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:05 np0005544118 nova_compute[187283]: 2025-12-03 14:18:05.957 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:05 np0005544118 nova_compute[187283]: 2025-12-03 14:18:05.957 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.111 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.112 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6198MB free_disk=73.37462615966797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.112 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.112 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.226 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.226 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.246 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.286 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.287 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:18:06 np0005544118 nova_compute[187283]: 2025-12-03 14:18:06.288 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:07 np0005544118 nova_compute[187283]: 2025-12-03 14:18:07.287 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:07 np0005544118 nova_compute[187283]: 2025-12-03 14:18:07.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:08 np0005544118 nova_compute[187283]: 2025-12-03 14:18:08.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:18:08 np0005544118 nova_compute[187283]: 2025-12-03 14:18:08.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:18:08 np0005544118 podman[208518]: 2025-12-03 14:18:08.824387009 +0000 UTC m=+0.058126675 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=multipathd)
Dec  3 09:18:17 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:17.825 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:18:17 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:17.826 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:18:18 np0005544118 podman[208540]: 2025-12-03 14:18:18.849702595 +0000 UTC m=+0.077205219 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Dec  3 09:18:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:18:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:18:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:18:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:18:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:18:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:18:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:18:20 np0005544118 podman[208561]: 2025-12-03 14:18:20.810168829 +0000 UTC m=+0.046074622 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:18:24 np0005544118 podman[208586]: 2025-12-03 14:18:24.867353635 +0000 UTC m=+0.101125691 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec  3 09:18:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:27.829 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:18:35 np0005544118 podman[197639]: time="2025-12-03T14:18:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:18:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:18:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:18:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:18:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2572 "" "Go-http-client/1.1"
Dec  3 09:18:35 np0005544118 podman[208612]: 2025-12-03 14:18:35.819637592 +0000 UTC m=+0.055333057 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  3 09:18:39 np0005544118 podman[208633]: 2025-12-03 14:18:39.841887304 +0000 UTC m=+0.068600824 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:18:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:18:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:18:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:18:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:18:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:18:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:18:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:18:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:18:49 np0005544118 podman[208653]: 2025-12-03 14:18:49.818317448 +0000 UTC m=+0.050655555 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.424 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.425 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.444 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.557 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.558 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.568 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.569 187287 INFO nova.compute.claims [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.700 187287 DEBUG nova.compute.provider_tree [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.718 187287 DEBUG nova.scheduler.client.report [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.748 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.748 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.812 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.813 187287 DEBUG nova.network.neutron [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.838 187287 INFO nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.856 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.950 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.952 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.952 187287 INFO nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Creating image(s)#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.953 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.954 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.954 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.955 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:50 np0005544118 nova_compute[187283]: 2025-12-03 14:18:50.955 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:51 np0005544118 nova_compute[187283]: 2025-12-03 14:18:51.644 187287 WARNING oslo_policy.policy [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  3 09:18:51 np0005544118 nova_compute[187283]: 2025-12-03 14:18:51.645 187287 WARNING oslo_policy.policy [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec  3 09:18:51 np0005544118 nova_compute[187283]: 2025-12-03 14:18:51.647 187287 DEBUG nova.policy [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4376db45a5649bbab6eb86fb45a0248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a16e69f7b8f43529e0c039245ec148d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:18:51 np0005544118 podman[208673]: 2025-12-03 14:18:51.809445255 +0000 UTC m=+0.047619697 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.345 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.426 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.part --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.428 187287 DEBUG nova.virt.images [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] c4df1e47-ea6c-486a-a6b4-60f325b44502 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.429 187287 DEBUG nova.privsep.utils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.430 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.part /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.690 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.part /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.converted" returned: 0 in 0.261s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.696 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.749 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b.converted --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.750 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:52 np0005544118 nova_compute[187283]: 2025-12-03 14:18:52.764 187287 INFO oslo.privsep.daemon [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpf78lzbuv/privsep.sock']#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.397 187287 INFO oslo.privsep.daemon [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.299 208716 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.303 208716 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.306 208716 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.306 208716 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208716#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.488 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.580 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.581 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.582 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.601 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.656 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.657 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.693 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.694 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.695 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.766 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.767 187287 DEBUG nova.virt.disk.api [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Checking if we can resize image /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.767 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.837 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.838 187287 DEBUG nova.virt.disk.api [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Cannot resize image /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.839 187287 DEBUG nova.objects.instance [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lazy-loading 'migration_context' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.858 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.859 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Ensure instance console log exists: /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.860 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.860 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.861 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:18:53 np0005544118 nova_compute[187283]: 2025-12-03 14:18:53.871 187287 DEBUG nova.network.neutron [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Successfully created port: 27279299-81d4-46c8-a65e-40a61fe9ef64 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:18:55 np0005544118 podman[208733]: 2025-12-03 14:18:55.84867563 +0000 UTC m=+0.079729831 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS)
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.051 187287 DEBUG nova.network.neutron [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Successfully updated port: 27279299-81d4-46c8-a65e-40a61fe9ef64 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.068 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.069 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquired lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.069 187287 DEBUG nova.network.neutron [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.220 187287 DEBUG nova.network.neutron [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.535 187287 DEBUG nova.compute.manager [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-changed-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.535 187287 DEBUG nova.compute.manager [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Refreshing instance network info cache due to event network-changed-27279299-81d4-46c8-a65e-40a61fe9ef64. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.535 187287 DEBUG oslo_concurrency.lockutils [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.952 187287 DEBUG nova.network.neutron [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.978 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Releasing lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.979 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance network_info: |[{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.979 187287 DEBUG oslo_concurrency.lockutils [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.979 187287 DEBUG nova.network.neutron [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Refreshing network info cache for port 27279299-81d4-46c8-a65e-40a61fe9ef64 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.983 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Start _get_guest_xml network_info=[{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.989 187287 WARNING nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.994 187287 DEBUG nova.virt.libvirt.host [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:18:56 np0005544118 nova_compute[187283]: 2025-12-03 14:18:56.995 187287 DEBUG nova.virt.libvirt.host [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.001 187287 DEBUG nova.virt.libvirt.host [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.002 187287 DEBUG nova.virt.libvirt.host [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.004 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.004 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.005 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.005 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.006 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.006 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.006 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.006 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.007 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.007 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.007 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.007 187287 DEBUG nova.virt.hardware [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.012 187287 DEBUG nova.privsep.utils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.014 187287 DEBUG nova.virt.libvirt.vif [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-479405706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-479405706',id=1,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-gel2j9x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-pr
oject-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:18:50Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=7e534904-afa6-40ef-bb5a-ac4971f60d75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.014 187287 DEBUG nova.network.os_vif_util [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converting VIF {"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.015 187287 DEBUG nova.network.os_vif_util [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.017 187287 DEBUG nova.objects.instance [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.040 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <uuid>7e534904-afa6-40ef-bb5a-ac4971f60d75</uuid>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <name>instance-00000001</name>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-479405706</nova:name>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:18:56</nova:creationTime>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:user uuid="a4376db45a5649bbab6eb86fb45a0248">tempest-TestExecuteActionsViaActuator-1110081854-project-member</nova:user>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:project uuid="8a16e69f7b8f43529e0c039245ec148d">tempest-TestExecuteActionsViaActuator-1110081854</nova:project>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        <nova:port uuid="27279299-81d4-46c8-a65e-40a61fe9ef64">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <entry name="serial">7e534904-afa6-40ef-bb5a-ac4971f60d75</entry>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <entry name="uuid">7e534904-afa6-40ef-bb5a-ac4971f60d75</entry>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:db:54:b8"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <target dev="tap27279299-81"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/console.log" append="off"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:18:57 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:18:57 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:18:57 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:18:57 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.041 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Preparing to wait for external event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.042 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.042 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.042 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.043 187287 DEBUG nova.virt.libvirt.vif [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-479405706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-479405706',id=1,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-gel2j9x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:18:50Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=7e534904-afa6-40ef-bb5a-ac4971f60d75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.043 187287 DEBUG nova.network.os_vif_util [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converting VIF {"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.043 187287 DEBUG nova.network.os_vif_util [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.044 187287 DEBUG os_vif [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.076 187287 DEBUG ovsdbapp.backend.ovs_idl [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.077 187287 DEBUG ovsdbapp.backend.ovs_idl [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.077 187287 DEBUG ovsdbapp.backend.ovs_idl [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.077 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.078 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.078 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.078 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.079 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.081 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.089 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.089 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.090 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.090 187287 INFO oslo.privsep.daemon [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpshf4s_s0/privsep.sock']
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.791 187287 INFO oslo.privsep.daemon [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Spawned new privsep daemon via rootwrap
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.664 208764 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.669 208764 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.671 208764 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec  3 09:18:57 np0005544118 nova_compute[187283]: 2025-12-03 14:18:57.671 208764 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208764
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.019 187287 DEBUG nova.network.neutron [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updated VIF entry in instance network info cache for port 27279299-81d4-46c8-a65e-40a61fe9ef64. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.020 187287 DEBUG nova.network.neutron [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.042 187287 DEBUG oslo_concurrency.lockutils [req-3c8cb4ef-ad49-41ad-83b5-4b19f1bbff1f req-d47ef597-2d31-4d28-94f0-537bec0bd9ee c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.114 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.114 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27279299-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.115 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27279299-81, col_values=(('external_ids', {'iface-id': '27279299-81d4-46c8-a65e-40a61fe9ef64', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:54:b8', 'vm-uuid': '7e534904-afa6-40ef-bb5a-ac4971f60d75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.117 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:58 np0005544118 NetworkManager[55710]: <info>  [1764771538.1189] manager: (tap27279299-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.119 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.126 187287 INFO os_vif [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81')
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.183 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.184 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.184 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No VIF found with MAC fa:16:3e:db:54:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.185 187287 INFO nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Using config drive
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.772 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.811 187287 INFO nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Creating config drive at /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.816 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9s7adrhl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  3 09:18:58 np0005544118 nova_compute[187283]: 2025-12-03 14:18:58.939 187287 DEBUG oslo_concurrency.processutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9s7adrhl" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  3 09:18:58 np0005544118 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec  3 09:18:58 np0005544118 kernel: tap27279299-81: entered promiscuous mode
Dec  3 09:18:59 np0005544118 ovn_controller[95637]: 2025-12-03T14:18:58Z|00027|binding|INFO|Claiming lport 27279299-81d4-46c8-a65e-40a61fe9ef64 for this chassis.
Dec  3 09:18:59 np0005544118 ovn_controller[95637]: 2025-12-03T14:18:58Z|00028|binding|INFO|27279299-81d4-46c8-a65e-40a61fe9ef64: Claiming fa:16:3e:db:54:b8 10.100.0.3
Dec  3 09:18:59 np0005544118 NetworkManager[55710]: <info>  [1764771539.0008] manager: (tap27279299-81): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.000 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.002 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.017 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:54:b8 10.100.0.3'], port_security=['fa:16:3e:db:54:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7e534904-afa6-40ef-bb5a-ac4971f60d75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=27279299-81d4-46c8-a65e-40a61fe9ef64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.018 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 27279299-81d4-46c8-a65e-40a61fe9ef64 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 bound to our chassis
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.020 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08007569-27ca-4ce6-b140-d7ea7d6cd593
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.022 104491 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpbhthkocs/privsep.sock']
Dec  3 09:18:59 np0005544118 systemd-udevd[208789]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:18:59 np0005544118 NetworkManager[55710]: <info>  [1764771539.0422] device (tap27279299-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:18:59 np0005544118 NetworkManager[55710]: <info>  [1764771539.0431] device (tap27279299-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:18:59 np0005544118 systemd-machined[153602]: New machine qemu-1-instance-00000001.
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.067 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:59 np0005544118 ovn_controller[95637]: 2025-12-03T14:18:59Z|00029|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 ovn-installed in OVS
Dec  3 09:18:59 np0005544118 ovn_controller[95637]: 2025-12-03T14:18:59Z|00030|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 up in Southbound
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.073 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:18:59 np0005544118 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.255 187287 DEBUG nova.compute.manager [req-b0703a18-b22b-4d4b-9ab3-620574a336b8 req-9cb2547f-7a82-4b0b-a2c2-af23332c5d3f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.255 187287 DEBUG oslo_concurrency.lockutils [req-b0703a18-b22b-4d4b-9ab3-620574a336b8 req-9cb2547f-7a82-4b0b-a2c2-af23332c5d3f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.256 187287 DEBUG oslo_concurrency.lockutils [req-b0703a18-b22b-4d4b-9ab3-620574a336b8 req-9cb2547f-7a82-4b0b-a2c2-af23332c5d3f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.256 187287 DEBUG oslo_concurrency.lockutils [req-b0703a18-b22b-4d4b-9ab3-620574a336b8 req-9cb2547f-7a82-4b0b-a2c2-af23332c5d3f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.256 187287 DEBUG nova.compute.manager [req-b0703a18-b22b-4d4b-9ab3-620574a336b8 req-9cb2547f-7a82-4b0b-a2c2-af23332c5d3f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Processing event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.529 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771539.5284948, 7e534904-afa6-40ef-bb5a-ac4971f60d75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.529 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] VM Started (Lifecycle Event)
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.532 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.544 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.549 187287 INFO nova.virt.libvirt.driver [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance spawned successfully.
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.549 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.566 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.571 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.575 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.575 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.576 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.576 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.577 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.577 187287 DEBUG nova.virt.libvirt.driver [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.602 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.603 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771539.5296926, 7e534904-afa6-40ef-bb5a-ac4971f60d75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.603 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] VM Paused (Lifecycle Event)
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.627 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.630 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771539.5443883, 7e534904-afa6-40ef-bb5a-ac4971f60d75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.630 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] VM Resumed (Lifecycle Event)
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.635 187287 INFO nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Took 8.68 seconds to spawn the instance on the hypervisor.
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.636 187287 DEBUG nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.658 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.661 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.691 104491 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.692 104491 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpbhthkocs/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.543 208813 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.548 208813 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.550 208813 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.551 208813 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208813
Dec  3 09:18:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:18:59.696 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1aba37-5739-4a58-ad29-71c21fd2e184]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.697 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.708 187287 INFO nova.compute.manager [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Took 9.18 seconds to build instance.
Dec  3 09:18:59 np0005544118 nova_compute[187283]: 2025-12-03 14:18:59.726 187287 DEBUG oslo_concurrency.lockutils [None req-cd35ccbe-5555-49d2-94cd-7db015f4e7c2 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.241 208813 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.241 208813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.241 208813 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.852 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc1a12d-8472-4a2e-bb5f-5cca46ed37bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.854 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08007569-21 in ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.857 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08007569-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.857 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[72c10fda-b949-4ba7-b18b-0b955d654f61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.862 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[72a3329a-26d7-413d-a0ee-3cadc62b3165]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.887 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[080a771d-eb54-46c4-a6d3-91b961372a18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.901 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e13d1172-e865-412f-812a-d9d1c78ef0e8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.903 104491 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpy4de4dn7/privsep.sock']
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.949 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.950 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:19:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:00.950 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:19:01 np0005544118 nova_compute[187283]: 2025-12-03 14:19:01.325 187287 DEBUG nova.compute.manager [req-d76f4012-65b9-41fb-ac3d-4d31d5a9bb20 req-bcb6b33d-b334-4713-8e4a-ec15eff4c678 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  3 09:19:01 np0005544118 nova_compute[187283]: 2025-12-03 14:19:01.325 187287 DEBUG oslo_concurrency.lockutils [req-d76f4012-65b9-41fb-ac3d-4d31d5a9bb20 req-bcb6b33d-b334-4713-8e4a-ec15eff4c678 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:19:01 np0005544118 nova_compute[187283]: 2025-12-03 14:19:01.325 187287 DEBUG oslo_concurrency.lockutils [req-d76f4012-65b9-41fb-ac3d-4d31d5a9bb20 req-bcb6b33d-b334-4713-8e4a-ec15eff4c678 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:19:01 np0005544118 nova_compute[187283]: 2025-12-03 14:19:01.325 187287 DEBUG oslo_concurrency.lockutils [req-d76f4012-65b9-41fb-ac3d-4d31d5a9bb20 req-bcb6b33d-b334-4713-8e4a-ec15eff4c678 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:19:01 np0005544118 nova_compute[187283]: 2025-12-03 14:19:01.326 187287 DEBUG nova.compute.manager [req-d76f4012-65b9-41fb-ac3d-4d31d5a9bb20 req-bcb6b33d-b334-4713-8e4a-ec15eff4c678 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] No waiting events found dispatching network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  3 09:19:01 np0005544118 nova_compute[187283]: 2025-12-03 14:19:01.326 187287 WARNING nova.compute.manager [req-d76f4012-65b9-41fb-ac3d-4d31d5a9bb20 req-bcb6b33d-b334-4713-8e4a-ec15eff4c678 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received unexpected event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 for instance with vm_state active and task_state None.
Dec  3 09:19:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:01.718 104491 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec  3 09:19:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:01.719 104491 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpy4de4dn7/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec  3 09:19:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:01.495 208827 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec  3 09:19:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:01.510 208827 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec  3 09:19:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:01.512 208827 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec  3 09:19:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:01.512 208827 INFO oslo.privsep.daemon [-] privsep daemon running as pid 208827
Dec  3 09:19:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:01.722 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[d416eac8-bfe7-48c8-ae34-07bb5f07c934]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:02.417 208827 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:19:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:02.417 208827 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:19:02 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:02.417 208827 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:19:02 np0005544118 nova_compute[187283]: 2025-12-03 14:19:02.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:19:03 np0005544118 nova_compute[187283]: 2025-12-03 14:19:03.118 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.166 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[38676071-de6d-48e9-82d4-4391151a4749]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.190 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fc6af7bf-4df5-48b0-84f8-057b64411f7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:03 np0005544118 NetworkManager[55710]: <info>  [1764771543.1916] manager: (tap08007569-20): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Dec  3 09:19:03 np0005544118 systemd-udevd[208839]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.223 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[28769a2f-9cd0-4cc9-b5ca-97f661e37a7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.229 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[cab88588-cfcb-447f-a6c0-404ea07f7715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec  3 09:19:03 np0005544118 NetworkManager[55710]: <info>  [1764771543.2583] device (tap08007569-20): carrier: link connected
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.263 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[1d77ba69-29b3-4f38-a4b0-07b96839e2af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.300 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[33801cfd-ed19-43d8-8294-0e01516e966d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369886, 'reachable_time': 16442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 208857, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.322 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4f5b1b-078c-4fa9-98da-c8d6d6d76948]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5a:c957'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369886, 'tstamp': 369886}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 208858, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.346 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5df8b93b-cbd5-4d9d-80fd-5ed59e9827e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369886, 'reachable_time': 16442, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 208859, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.373 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0494ea-4397-4ccd-9cfe-55b4d3c7a445]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.427 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7a2eea3e-8f0d-406f-b6cf-9abe94def83f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.428 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.429 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.429 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08007569-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:03 np0005544118 NetworkManager[55710]: <info>  [1764771543.4320] manager: (tap08007569-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Dec  3 09:19:03 np0005544118 kernel: tap08007569-20: entered promiscuous mode
Dec  3 09:19:03 np0005544118 nova_compute[187283]: 2025-12-03 14:19:03.431 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:03 np0005544118 nova_compute[187283]: 2025-12-03 14:19:03.433 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.434 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08007569-20, col_values=(('external_ids', {'iface-id': '445f8411-d8a5-4a44-8c5f-54cc69a35119'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:03 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:03Z|00031|binding|INFO|Releasing lport 445f8411-d8a5-4a44-8c5f-54cc69a35119 from this chassis (sb_readonly=0)
Dec  3 09:19:03 np0005544118 nova_compute[187283]: 2025-12-03 14:19:03.435 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:03 np0005544118 nova_compute[187283]: 2025-12-03 14:19:03.446 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.447 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08007569-27ca-4ce6-b140-d7ea7d6cd593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08007569-27ca-4ce6-b140-d7ea7d6cd593.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.448 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[649f9118-d633-4df4-afd4-184363394d5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.450 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-08007569-27ca-4ce6-b140-d7ea7d6cd593
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/08007569-27ca-4ce6-b140-d7ea7d6cd593.pid.haproxy
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 08007569-27ca-4ce6-b140-d7ea7d6cd593
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:19:03 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:03.450 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'env', 'PROCESS_TAG=haproxy-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08007569-27ca-4ce6-b140-d7ea7d6cd593.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:19:03 np0005544118 nova_compute[187283]: 2025-12-03 14:19:03.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:03 np0005544118 nova_compute[187283]: 2025-12-03 14:19:03.773 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:03 np0005544118 podman[208892]: 2025-12-03 14:19:03.883342052 +0000 UTC m=+0.067540054 container create 8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:19:03 np0005544118 systemd[1]: Started libpod-conmon-8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437.scope.
Dec  3 09:19:03 np0005544118 podman[208892]: 2025-12-03 14:19:03.851184236 +0000 UTC m=+0.035382258 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:19:03 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:19:03 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36f452b4e822da6f57d072178b9effbdcad215b23dd1b613e88cf3e313965637/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:19:04 np0005544118 podman[208892]: 2025-12-03 14:19:04.000351073 +0000 UTC m=+0.184549095 container init 8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  3 09:19:04 np0005544118 podman[208892]: 2025-12-03 14:19:04.00552118 +0000 UTC m=+0.189719182 container start 8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:19:04 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[208907]: [NOTICE]   (208911) : New worker (208913) forked
Dec  3 09:19:04 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[208907]: [NOTICE]   (208911) : Loading success.
Dec  3 09:19:04 np0005544118 nova_compute[187283]: 2025-12-03 14:19:04.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:05 np0005544118 nova_compute[187283]: 2025-12-03 14:19:05.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:05 np0005544118 nova_compute[187283]: 2025-12-03 14:19:05.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:19:05 np0005544118 nova_compute[187283]: 2025-12-03 14:19:05.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:19:05 np0005544118 podman[197639]: time="2025-12-03T14:19:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:19:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:19:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:19:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:19:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3036 "" "Go-http-client/1.1"
Dec  3 09:19:05 np0005544118 nova_compute[187283]: 2025-12-03 14:19:05.775 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:19:05 np0005544118 nova_compute[187283]: 2025-12-03 14:19:05.775 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:19:05 np0005544118 nova_compute[187283]: 2025-12-03 14:19:05.776 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:19:05 np0005544118 nova_compute[187283]: 2025-12-03 14:19:05.776 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:06 np0005544118 nova_compute[187283]: 2025-12-03 14:19:06.804 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:19:06 np0005544118 nova_compute[187283]: 2025-12-03 14:19:06.830 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:19:06 np0005544118 nova_compute[187283]: 2025-12-03 14:19:06.830 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:19:06 np0005544118 nova_compute[187283]: 2025-12-03 14:19:06.830 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:06 np0005544118 podman[208922]: 2025-12-03 14:19:06.848073396 +0000 UTC m=+0.071835346 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.626 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.627 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.627 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.628 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.700 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.757 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.758 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.809 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.942 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.944 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5759MB free_disk=73.33953094482422GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.944 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:07 np0005544118 nova_compute[187283]: 2025-12-03 14:19:07.945 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.011 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 7e534904-afa6-40ef-bb5a-ac4971f60d75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.012 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.012 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.055 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.093 187287 ERROR nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [req-858f1500-ecad-46f1-bf55-b91bd89eb116] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 52e95542-7192-4eec-a5dc-18596ad73a72.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-858f1500-ecad-46f1-bf55-b91bd89eb116"}]}#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.108 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.122 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.125 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.125 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.205 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.243 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.283 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.329 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updated inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.330 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.330 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.359 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.360 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.415s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:08 np0005544118 nova_compute[187283]: 2025-12-03 14:19:08.775 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:09 np0005544118 nova_compute[187283]: 2025-12-03 14:19:09.356 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:09 np0005544118 nova_compute[187283]: 2025-12-03 14:19:09.378 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:09 np0005544118 nova_compute[187283]: 2025-12-03 14:19:09.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:09 np0005544118 nova_compute[187283]: 2025-12-03 14:19:09.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:19:09 np0005544118 nova_compute[187283]: 2025-12-03 14:19:09.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:19:10 np0005544118 podman[208951]: 2025-12-03 14:19:10.850627477 +0000 UTC m=+0.075019937 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:19:12 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:12Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:db:54:b8 10.100.0.3
Dec  3 09:19:12 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:12Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:54:b8 10.100.0.3
Dec  3 09:19:13 np0005544118 nova_compute[187283]: 2025-12-03 14:19:13.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:13 np0005544118 nova_compute[187283]: 2025-12-03 14:19:13.778 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:18 np0005544118 nova_compute[187283]: 2025-12-03 14:19:18.128 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:18 np0005544118 nova_compute[187283]: 2025-12-03 14:19:18.803 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:19:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:19:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:19:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:19:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:19:20 np0005544118 podman[208987]: 2025-12-03 14:19:20.845482696 +0000 UTC m=+0.061046520 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:19:22 np0005544118 podman[209008]: 2025-12-03 14:19:22.82665328 +0000 UTC m=+0.053179576 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:19:22 np0005544118 nova_compute[187283]: 2025-12-03 14:19:22.931 187287 DEBUG nova.compute.manager [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.036 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.037 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.057 187287 DEBUG nova.objects.instance [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'pci_requests' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.074 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.075 187287 INFO nova.compute.claims [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.075 187287 DEBUG nova.objects.instance [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'resources' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.089 187287 DEBUG nova.objects.instance [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'pci_devices' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.132 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.260 187287 INFO nova.compute.resource_tracker [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating resource usage from migration 2ec75982-36b0-436d-a75a-43ef1838e23b#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.320 187287 DEBUG nova.compute.provider_tree [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.337 187287 DEBUG nova.scheduler.client.report [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.361 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.361 187287 INFO nova.compute.manager [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Migrating#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.362 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.362 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.371 187287 INFO nova.compute.rpcapi [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.372 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.406 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.406 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.407 187287 DEBUG nova.network.neutron [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:19:23 np0005544118 nova_compute[187283]: 2025-12-03 14:19:23.806 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:25 np0005544118 nova_compute[187283]: 2025-12-03 14:19:25.633 187287 DEBUG nova.network.neutron [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:19:25 np0005544118 nova_compute[187283]: 2025-12-03 14:19:25.705 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:19:25 np0005544118 nova_compute[187283]: 2025-12-03 14:19:25.790 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Dec  3 09:19:25 np0005544118 nova_compute[187283]: 2025-12-03 14:19:25.794 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  3 09:19:26 np0005544118 podman[209032]: 2025-12-03 14:19:26.887724093 +0000 UTC m=+0.109809964 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:19:27 np0005544118 kernel: tap27279299-81 (unregistering): left promiscuous mode
Dec  3 09:19:27 np0005544118 NetworkManager[55710]: <info>  [1764771567.9702] device (tap27279299-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:19:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:27Z|00032|binding|INFO|Releasing lport 27279299-81d4-46c8-a65e-40a61fe9ef64 from this chassis (sb_readonly=0)
Dec  3 09:19:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:27Z|00033|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 down in Southbound
Dec  3 09:19:27 np0005544118 nova_compute[187283]: 2025-12-03 14:19:27.973 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:27Z|00034|binding|INFO|Removing iface tap27279299-81 ovn-installed in OVS
Dec  3 09:19:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:27.989 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:54:b8 10.100.0.3'], port_security=['fa:16:3e:db:54:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7e534904-afa6-40ef-bb5a-ac4971f60d75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=27279299-81d4-46c8-a65e-40a61fe9ef64) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:19:27 np0005544118 nova_compute[187283]: 2025-12-03 14:19:27.990 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:27.991 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 27279299-81d4-46c8-a65e-40a61fe9ef64 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 unbound from our chassis#033[00m
Dec  3 09:19:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:27.993 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08007569-27ca-4ce6-b140-d7ea7d6cd593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:27.995 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f9afed8f-71a0-4def-9439-2f39545b20a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:27.997 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 namespace which is not needed anymore#033[00m
Dec  3 09:19:28 np0005544118 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec  3 09:19:28 np0005544118 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 14.013s CPU time.
Dec  3 09:19:28 np0005544118 systemd-machined[153602]: Machine qemu-1-instance-00000001 terminated.
Dec  3 09:19:28 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[208907]: [NOTICE]   (208911) : haproxy version is 2.8.14-c23fe91
Dec  3 09:19:28 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[208907]: [NOTICE]   (208911) : path to executable is /usr/sbin/haproxy
Dec  3 09:19:28 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[208907]: [WARNING]  (208911) : Exiting Master process...
Dec  3 09:19:28 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[208907]: [ALERT]    (208911) : Current worker (208913) exited with code 143 (Terminated)
Dec  3 09:19:28 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[208907]: [WARNING]  (208911) : All workers exited. Exiting... (0)
Dec  3 09:19:28 np0005544118 systemd[1]: libpod-8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437.scope: Deactivated successfully.
Dec  3 09:19:28 np0005544118 podman[209082]: 2025-12-03 14:19:28.129860153 +0000 UTC m=+0.043906523 container died 8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.133 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:28 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437-userdata-shm.mount: Deactivated successfully.
Dec  3 09:19:28 np0005544118 systemd[1]: var-lib-containers-storage-overlay-36f452b4e822da6f57d072178b9effbdcad215b23dd1b613e88cf3e313965637-merged.mount: Deactivated successfully.
Dec  3 09:19:28 np0005544118 podman[209082]: 2025-12-03 14:19:28.171387881 +0000 UTC m=+0.085434241 container cleanup 8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:19:28 np0005544118 systemd[1]: libpod-conmon-8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437.scope: Deactivated successfully.
Dec  3 09:19:28 np0005544118 podman[209112]: 2025-12-03 14:19:28.232749518 +0000 UTC m=+0.042783403 container remove 8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.237 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8215f772-1782-4314-8bbd-fea4296bef8f]: (4, ('Wed Dec  3 02:19:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 (8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437)\n8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437\nWed Dec  3 02:19:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 (8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437)\n8b2425cf09609805e584768adc6d80bbfeebfc90c28d7a9f97ce278e13661437\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.239 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e9cc44-620c-4fee-93e2-96dfbac8e675]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.240 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.241 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:28 np0005544118 kernel: tap08007569-20: left promiscuous mode
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.255 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.258 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[04f41c65-aec0-4062-975f-8400b4b9eac6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.272 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2a395add-2763-426a-832d-9c4303659582]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.272 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f0bc5ece-b49d-4e64-9b7b-64482ab4b947]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.285 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4f269baa-c241-40d4-a586-ab4683c04e22]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369876, 'reachable_time': 41084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209145, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 systemd[1]: run-netns-ovnmeta\x2d08007569\x2d27ca\x2d4ce6\x2db140\x2dd7ea7d6cd593.mount: Deactivated successfully.
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.298 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.299 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[071e3123-1a14-42c8-b020-cd660b50d470]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.732 187287 DEBUG nova.compute.manager [req-7cee839f-4a59-408d-a45a-d798e8667835 req-a92607f2-da98-44ee-8938-a25f1044b9fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-unplugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.733 187287 DEBUG oslo_concurrency.lockutils [req-7cee839f-4a59-408d-a45a-d798e8667835 req-a92607f2-da98-44ee-8938-a25f1044b9fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.733 187287 DEBUG oslo_concurrency.lockutils [req-7cee839f-4a59-408d-a45a-d798e8667835 req-a92607f2-da98-44ee-8938-a25f1044b9fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.733 187287 DEBUG oslo_concurrency.lockutils [req-7cee839f-4a59-408d-a45a-d798e8667835 req-a92607f2-da98-44ee-8938-a25f1044b9fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.734 187287 DEBUG nova.compute.manager [req-7cee839f-4a59-408d-a45a-d798e8667835 req-a92607f2-da98-44ee-8938-a25f1044b9fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] No waiting events found dispatching network-vif-unplugged-27279299-81d4-46c8-a65e-40a61fe9ef64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.734 187287 WARNING nova.compute.manager [req-7cee839f-4a59-408d-a45a-d798e8667835 req-a92607f2-da98-44ee-8938-a25f1044b9fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received unexpected event network-vif-unplugged-27279299-81d4-46c8-a65e-40a61fe9ef64 for instance with vm_state active and task_state resize_migrating.#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.808 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.810 187287 INFO nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance shutdown successfully after 3 seconds.#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.816 187287 INFO nova.virt.libvirt.driver [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance destroyed successfully.#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.816 187287 DEBUG nova.virt.libvirt.vif [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-479405706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-479405706',id=1,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:18:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-gel2j9x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:19:23Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=7e534904-afa6-40ef-bb5a-ac4971f60d75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:db:54:b8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.817 187287 DEBUG nova.network.os_vif_util [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:db:54:b8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.817 187287 DEBUG nova.network.os_vif_util [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.818 187287 DEBUG os_vif [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.820 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.820 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27279299-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.821 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.823 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.825 187287 INFO os_vif [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81')#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.829 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.869 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:19:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:28.870 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.871 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.899 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.900 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.952 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.953 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_resize/disk /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.983 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "cp -r /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_resize/disk /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:28 np0005544118 nova_compute[187283]: 2025-12-03 14:19:28.985 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_resize/disk.config /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.012 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "cp -r /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_resize/disk.config /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.013 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_resize/disk.info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.030 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "cp -r /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_resize/disk.info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info" returned: 0 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.617 187287 DEBUG nova.network.neutron [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Port 27279299-81d4-46c8-a65e-40a61fe9ef64 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.736 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.737 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.737 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.940 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.940 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:19:29 np0005544118 nova_compute[187283]: 2025-12-03 14:19:29.941 187287 DEBUG nova.network.neutron [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:19:30 np0005544118 nova_compute[187283]: 2025-12-03 14:19:30.829 187287 DEBUG nova.compute.manager [req-35555f17-4670-4e12-8a0a-cca5852a0ce4 req-e1b233ab-4c0a-4dd2-a0fd-108bc12e37a0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:19:30 np0005544118 nova_compute[187283]: 2025-12-03 14:19:30.830 187287 DEBUG oslo_concurrency.lockutils [req-35555f17-4670-4e12-8a0a-cca5852a0ce4 req-e1b233ab-4c0a-4dd2-a0fd-108bc12e37a0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:30 np0005544118 nova_compute[187283]: 2025-12-03 14:19:30.830 187287 DEBUG oslo_concurrency.lockutils [req-35555f17-4670-4e12-8a0a-cca5852a0ce4 req-e1b233ab-4c0a-4dd2-a0fd-108bc12e37a0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:30 np0005544118 nova_compute[187283]: 2025-12-03 14:19:30.831 187287 DEBUG oslo_concurrency.lockutils [req-35555f17-4670-4e12-8a0a-cca5852a0ce4 req-e1b233ab-4c0a-4dd2-a0fd-108bc12e37a0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:30 np0005544118 nova_compute[187283]: 2025-12-03 14:19:30.831 187287 DEBUG nova.compute.manager [req-35555f17-4670-4e12-8a0a-cca5852a0ce4 req-e1b233ab-4c0a-4dd2-a0fd-108bc12e37a0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] No waiting events found dispatching network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:19:30 np0005544118 nova_compute[187283]: 2025-12-03 14:19:30.831 187287 WARNING nova.compute.manager [req-35555f17-4670-4e12-8a0a-cca5852a0ce4 req-e1b233ab-4c0a-4dd2-a0fd-108bc12e37a0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received unexpected event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 for instance with vm_state active and task_state resize_migrated.#033[00m
Dec  3 09:19:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:30.873 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:31 np0005544118 nova_compute[187283]: 2025-12-03 14:19:31.889 187287 DEBUG nova.network.neutron [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:19:31 np0005544118 nova_compute[187283]: 2025-12-03 14:19:31.917 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.022 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.024 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.025 187287 INFO nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Creating image(s)#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.027 187287 DEBUG nova.objects.instance [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.045 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.100 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.102 187287 DEBUG nova.virt.disk.api [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.103 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.162 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.164 187287 DEBUG nova.virt.disk.api [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.621 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.622 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Ensure instance console log exists: /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.622 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.623 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.623 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.625 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Start _get_guest_xml network_info=[{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:db:54:b8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.632 187287 WARNING nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.643 187287 DEBUG nova.virt.libvirt.host [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.644 187287 DEBUG nova.virt.libvirt.host [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.649 187287 DEBUG nova.virt.libvirt.host [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.649 187287 DEBUG nova.virt.libvirt.host [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.651 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.651 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='f750bd6c-10d6-4121-bf9f-842530ebc76d',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.651 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.651 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.652 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.652 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.652 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.652 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.652 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.653 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.653 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.653 187287 DEBUG nova.virt.hardware [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.653 187287 DEBUG nova.objects.instance [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.685 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.737 187287 DEBUG oslo_concurrency.processutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.738 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.738 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.739 187287 DEBUG oslo_concurrency.lockutils [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.740 187287 DEBUG nova.virt.libvirt.vif [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-479405706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-479405706',id=1,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:18:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-gel2j9x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:19:29Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=7e534904-afa6-40ef-bb5a-ac4971f60d75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:db:54:b8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.740 187287 DEBUG nova.network.os_vif_util [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:db:54:b8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.741 187287 DEBUG nova.network.os_vif_util [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.744 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <uuid>7e534904-afa6-40ef-bb5a-ac4971f60d75</uuid>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <name>instance-00000001</name>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <memory>196608</memory>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-479405706</nova:name>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:19:32</nova:creationTime>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.micro">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:memory>192</nova:memory>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:user uuid="a4376db45a5649bbab6eb86fb45a0248">tempest-TestExecuteActionsViaActuator-1110081854-project-member</nova:user>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:project uuid="8a16e69f7b8f43529e0c039245ec148d">tempest-TestExecuteActionsViaActuator-1110081854</nova:project>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        <nova:port uuid="27279299-81d4-46c8-a65e-40a61fe9ef64">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <entry name="serial">7e534904-afa6-40ef-bb5a-ac4971f60d75</entry>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <entry name="uuid">7e534904-afa6-40ef-bb5a-ac4971f60d75</entry>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk.config"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:db:54:b8"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <target dev="tap27279299-81"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/console.log" append="off"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:19:32 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:19:32 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:19:32 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:19:32 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.745 187287 DEBUG nova.virt.libvirt.vif [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-479405706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-479405706',id=1,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:18:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-gel2j9x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:19:29Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=7e534904-afa6-40ef-bb5a-ac4971f60d75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:db:54:b8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.745 187287 DEBUG nova.network.os_vif_util [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:db:54:b8"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.746 187287 DEBUG nova.network.os_vif_util [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.746 187287 DEBUG os_vif [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.747 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.747 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.747 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.750 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.750 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27279299-81, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.750 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27279299-81, col_values=(('external_ids', {'iface-id': '27279299-81d4-46c8-a65e-40a61fe9ef64', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:db:54:b8', 'vm-uuid': '7e534904-afa6-40ef-bb5a-ac4971f60d75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.752 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 NetworkManager[55710]: <info>  [1764771572.7528] manager: (tap27279299-81): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.754 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.756 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.756 187287 INFO os_vif [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81')#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.910 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.910 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.911 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No VIF found with MAC fa:16:3e:db:54:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.911 187287 INFO nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Using config drive#033[00m
Dec  3 09:19:32 np0005544118 kernel: tap27279299-81: entered promiscuous mode
Dec  3 09:19:32 np0005544118 NetworkManager[55710]: <info>  [1764771572.9743] manager: (tap27279299-81): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Dec  3 09:19:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:32Z|00035|binding|INFO|Claiming lport 27279299-81d4-46c8-a65e-40a61fe9ef64 for this chassis.
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.973 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:32Z|00036|binding|INFO|27279299-81d4-46c8-a65e-40a61fe9ef64: Claiming fa:16:3e:db:54:b8 10.100.0.3
Dec  3 09:19:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.982 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:54:b8 10.100.0.3'], port_security=['fa:16:3e:db:54:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7e534904-afa6-40ef-bb5a-ac4971f60d75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=27279299-81d4-46c8-a65e-40a61fe9ef64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:19:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.983 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 27279299-81d4-46c8-a65e-40a61fe9ef64 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 bound to our chassis#033[00m
Dec  3 09:19:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.984 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08007569-27ca-4ce6-b140-d7ea7d6cd593#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.986 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:32Z|00037|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 ovn-installed in OVS
Dec  3 09:19:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:32Z|00038|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 up in Southbound
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.989 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 nova_compute[187283]: 2025-12-03 14:19:32.991 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.996 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ee194893-1e8d-4eb8-b70e-85b918e0b4e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.997 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap08007569-21 in ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:19:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.998 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap08007569-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:19:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.998 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f24d55-fe69-4bc6-b704-b2eaf03f3ce8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:32.999 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3f13f6dc-1ced-4324-aa92-660fa8022fb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.008 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3ff1bf-e911-4b7c-814e-d73ad20f7244]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 systemd-machined[153602]: New machine qemu-2-instance-00000001.
Dec  3 09:19:33 np0005544118 systemd[1]: Started Virtual Machine qemu-2-instance-00000001.
Dec  3 09:19:33 np0005544118 systemd-udevd[209185]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.032 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a850ac9a-dfad-46b9-9e5b-3f55850d8345]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 NetworkManager[55710]: <info>  [1764771573.0437] device (tap27279299-81): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:19:33 np0005544118 NetworkManager[55710]: <info>  [1764771573.0443] device (tap27279299-81): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.064 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[2f02d7f9-4267-44fe-b2bc-575d04f6844e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.070 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d02acf3f-f347-4bd4-b4c9-e0c6d1d18e48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 NetworkManager[55710]: <info>  [1764771573.0711] manager: (tap08007569-20): new Veth device (/org/freedesktop/NetworkManager/Devices/26)
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.099 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[7c1997fd-e047-4ba5-929d-578a988b2cb7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.102 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[258a733a-a2a8-40ae-a5ab-126037472881]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 NetworkManager[55710]: <info>  [1764771573.1236] device (tap08007569-20): carrier: link connected
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.129 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[88063389-54bb-4a26-b20f-569310b7d1bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.148 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[da9f8719-74ae-444a-8c30-27c03515429f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372873, 'reachable_time': 42731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209215, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.164 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fc40ced3-608e-451f-bf8f-3ecf97f628f2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5a:c957'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372873, 'tstamp': 372873}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209216, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.168 187287 DEBUG nova.compute.manager [req-469f9b5f-d62e-4e2a-87b2-f100fe79496d req-7e24abf6-e9e9-4c64-85a6-cebad49cee51 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.168 187287 DEBUG oslo_concurrency.lockutils [req-469f9b5f-d62e-4e2a-87b2-f100fe79496d req-7e24abf6-e9e9-4c64-85a6-cebad49cee51 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.169 187287 DEBUG oslo_concurrency.lockutils [req-469f9b5f-d62e-4e2a-87b2-f100fe79496d req-7e24abf6-e9e9-4c64-85a6-cebad49cee51 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.169 187287 DEBUG oslo_concurrency.lockutils [req-469f9b5f-d62e-4e2a-87b2-f100fe79496d req-7e24abf6-e9e9-4c64-85a6-cebad49cee51 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.169 187287 DEBUG nova.compute.manager [req-469f9b5f-d62e-4e2a-87b2-f100fe79496d req-7e24abf6-e9e9-4c64-85a6-cebad49cee51 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] No waiting events found dispatching network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.170 187287 WARNING nova.compute.manager [req-469f9b5f-d62e-4e2a-87b2-f100fe79496d req-7e24abf6-e9e9-4c64-85a6-cebad49cee51 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received unexpected event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 for instance with vm_state active and task_state resize_finish.#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.182 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d884e710-6dc6-4260-bfb3-302934780618]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372873, 'reachable_time': 42731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 209219, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.213 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4780a0fe-c536-4597-81c0-a9c1e4f07ded]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.260 187287 DEBUG nova.virt.libvirt.host [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Removed pending event for 7e534904-afa6-40ef-bb5a-ac4971f60d75 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.260 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771573.2598186, 7e534904-afa6-40ef-bb5a-ac4971f60d75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.261 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.263 187287 DEBUG nova.compute.manager [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.267 187287 INFO nova.virt.libvirt.driver [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance running successfully.#033[00m
Dec  3 09:19:33 np0005544118 virtqemud[186958]: argument unsupported: QEMU guest agent is not configured
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.272 187287 DEBUG nova.virt.libvirt.guest [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.272 187287 DEBUG nova.virt.libvirt.driver [None req-b7cb2548-542f-417a-8071-98ebce48779d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.282 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.285 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.288 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[98b0182d-3b7d-4f4c-ae49-8fc8db685569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.290 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.290 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.290 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08007569-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:33 np0005544118 NetworkManager[55710]: <info>  [1764771573.2930] manager: (tap08007569-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Dec  3 09:19:33 np0005544118 kernel: tap08007569-20: entered promiscuous mode
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.292 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.296 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08007569-20, col_values=(('external_ids', {'iface-id': '445f8411-d8a5-4a44-8c5f-54cc69a35119'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:33Z|00039|binding|INFO|Releasing lport 445f8411-d8a5-4a44-8c5f-54cc69a35119 from this chassis (sb_readonly=0)
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.297 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.300 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/08007569-27ca-4ce6-b140-d7ea7d6cd593.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/08007569-27ca-4ce6-b140-d7ea7d6cd593.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.301 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[57e9ed91-7c6f-46fd-a5e1-47a140fcdf65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.302 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-08007569-27ca-4ce6-b140-d7ea7d6cd593
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/08007569-27ca-4ce6-b140-d7ea7d6cd593.pid.haproxy
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 08007569-27ca-4ce6-b140-d7ea7d6cd593
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:19:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:33.303 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'env', 'PROCESS_TAG=haproxy-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/08007569-27ca-4ce6-b140-d7ea7d6cd593.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.306 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.306 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771573.260792, 7e534904-afa6-40ef-bb5a-ac4971f60d75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.307 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] VM Started (Lifecycle Event)#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.309 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.351 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.355 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:19:33 np0005544118 podman[209254]: 2025-12-03 14:19:33.644673504 +0000 UTC m=+0.021426592 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:19:33 np0005544118 nova_compute[187283]: 2025-12-03 14:19:33.810 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:34 np0005544118 podman[209254]: 2025-12-03 14:19:34.087041 +0000 UTC m=+0.463794068 container create df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  3 09:19:34 np0005544118 systemd[1]: Started libpod-conmon-df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83.scope.
Dec  3 09:19:34 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:19:34 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a004e7f20ebedf3c7598d71aadb249e524a597540892f7c6aaad7b9ca7da471/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:19:34 np0005544118 podman[209254]: 2025-12-03 14:19:34.886499066 +0000 UTC m=+1.263252164 container init df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:19:34 np0005544118 podman[209254]: 2025-12-03 14:19:34.894517444 +0000 UTC m=+1.271270552 container start df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  3 09:19:34 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [NOTICE]   (209274) : New worker (209276) forked
Dec  3 09:19:34 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [NOTICE]   (209274) : Loading success.
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.246 187287 DEBUG nova.compute.manager [req-f56dd4c8-692f-433a-9857-5fbff1edf3a0 req-53497bf5-a5ba-4301-adf6-ff86e3531759 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.246 187287 DEBUG oslo_concurrency.lockutils [req-f56dd4c8-692f-433a-9857-5fbff1edf3a0 req-53497bf5-a5ba-4301-adf6-ff86e3531759 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.246 187287 DEBUG oslo_concurrency.lockutils [req-f56dd4c8-692f-433a-9857-5fbff1edf3a0 req-53497bf5-a5ba-4301-adf6-ff86e3531759 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.247 187287 DEBUG oslo_concurrency.lockutils [req-f56dd4c8-692f-433a-9857-5fbff1edf3a0 req-53497bf5-a5ba-4301-adf6-ff86e3531759 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.247 187287 DEBUG nova.compute.manager [req-f56dd4c8-692f-433a-9857-5fbff1edf3a0 req-53497bf5-a5ba-4301-adf6-ff86e3531759 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] No waiting events found dispatching network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.247 187287 WARNING nova.compute.manager [req-f56dd4c8-692f-433a-9857-5fbff1edf3a0 req-53497bf5-a5ba-4301-adf6-ff86e3531759 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received unexpected event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 for instance with vm_state resized and task_state None.#033[00m
Dec  3 09:19:35 np0005544118 podman[197639]: time="2025-12-03T14:19:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:19:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:19:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:19:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:19:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3043 "" "Go-http-client/1.1"
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.879 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.879 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:35 np0005544118 nova_compute[187283]: 2025-12-03 14:19:35.880 187287 DEBUG nova.compute.manager [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Going to confirm migration 1 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Dec  3 09:19:36 np0005544118 nova_compute[187283]: 2025-12-03 14:19:36.244 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:19:36 np0005544118 nova_compute[187283]: 2025-12-03 14:19:36.245 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:19:36 np0005544118 nova_compute[187283]: 2025-12-03 14:19:36.245 187287 DEBUG nova.network.neutron [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:19:36 np0005544118 nova_compute[187283]: 2025-12-03 14:19:36.245 187287 DEBUG nova.objects.instance [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'info_cache' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.753 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.768 187287 DEBUG nova.network.neutron [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.798 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.798 187287 DEBUG nova.objects.instance [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.817 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.817 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:37 np0005544118 podman[209285]: 2025-12-03 14:19:37.829456607 +0000 UTC m=+0.060234697 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, version=9.6)
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.900 187287 DEBUG nova.compute.provider_tree [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:19:37 np0005544118 nova_compute[187283]: 2025-12-03 14:19:37.914 187287 DEBUG nova.scheduler.client.report [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:19:38 np0005544118 nova_compute[187283]: 2025-12-03 14:19:38.124 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:38 np0005544118 nova_compute[187283]: 2025-12-03 14:19:38.307 187287 INFO nova.scheduler.client.report [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Deleted allocation for migration 2ec75982-36b0-436d-a75a-43ef1838e23b#033[00m
Dec  3 09:19:38 np0005544118 nova_compute[187283]: 2025-12-03 14:19:38.363 187287 DEBUG oslo_concurrency.lockutils [None req-4e514c1a-3915-4676-8321-116ec0468850 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 2.483s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:38 np0005544118 nova_compute[187283]: 2025-12-03 14:19:38.812 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:39 np0005544118 nova_compute[187283]: 2025-12-03 14:19:39.919 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:39 np0005544118 nova_compute[187283]: 2025-12-03 14:19:39.920 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:39 np0005544118 nova_compute[187283]: 2025-12-03 14:19:39.951 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.151 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.152 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.158 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.159 187287 INFO nova.compute.claims [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.280 187287 DEBUG nova.compute.provider_tree [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.296 187287 DEBUG nova.scheduler.client.report [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.492 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.493 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.540 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.540 187287 DEBUG nova.network.neutron [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.558 187287 INFO nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.578 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.660 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.661 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.662 187287 INFO nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Creating image(s)#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.662 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "/var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.662 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "/var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.663 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "/var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.679 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.742 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.744 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.744 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.759 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.819 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.820 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.958 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk 1073741824" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.959 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:40 np0005544118 nova_compute[187283]: 2025-12-03 14:19:40.960 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.011 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.012 187287 DEBUG nova.virt.disk.api [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Checking if we can resize image /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.012 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.061 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.063 187287 DEBUG nova.virt.disk.api [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Cannot resize image /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.063 187287 DEBUG nova.objects.instance [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lazy-loading 'migration_context' on Instance uuid 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.080 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.080 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Ensure instance console log exists: /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.081 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.081 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.081 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:41 np0005544118 nova_compute[187283]: 2025-12-03 14:19:41.642 187287 DEBUG nova.policy [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4376db45a5649bbab6eb86fb45a0248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a16e69f7b8f43529e0c039245ec148d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:19:41 np0005544118 podman[209323]: 2025-12-03 14:19:41.82172867 +0000 UTC m=+0.059807006 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:19:42 np0005544118 nova_compute[187283]: 2025-12-03 14:19:42.791 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:43 np0005544118 nova_compute[187283]: 2025-12-03 14:19:43.848 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:44 np0005544118 nova_compute[187283]: 2025-12-03 14:19:44.735 187287 DEBUG nova.network.neutron [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Successfully created port: 1b6ad066-7586-4a73-8359-26ddcc7fe684 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:19:46 np0005544118 nova_compute[187283]: 2025-12-03 14:19:46.796 187287 DEBUG nova.network.neutron [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Successfully updated port: 1b6ad066-7586-4a73-8359-26ddcc7fe684 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:19:46 np0005544118 nova_compute[187283]: 2025-12-03 14:19:46.813 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:19:46 np0005544118 nova_compute[187283]: 2025-12-03 14:19:46.813 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquired lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:19:46 np0005544118 nova_compute[187283]: 2025-12-03 14:19:46.813 187287 DEBUG nova.network.neutron [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:19:46 np0005544118 nova_compute[187283]: 2025-12-03 14:19:46.899 187287 DEBUG nova.compute.manager [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-changed-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:19:46 np0005544118 nova_compute[187283]: 2025-12-03 14:19:46.900 187287 DEBUG nova.compute.manager [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Refreshing instance network info cache due to event network-changed-1b6ad066-7586-4a73-8359-26ddcc7fe684. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:19:46 np0005544118 nova_compute[187283]: 2025-12-03 14:19:46.900 187287 DEBUG oslo_concurrency.lockutils [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:19:47 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:47Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:db:54:b8 10.100.0.3
Dec  3 09:19:47 np0005544118 nova_compute[187283]: 2025-12-03 14:19:47.642 187287 DEBUG nova.network.neutron [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:19:47 np0005544118 nova_compute[187283]: 2025-12-03 14:19:47.793 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.848 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.890 187287 DEBUG nova.network.neutron [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Updating instance_info_cache with network_info: [{"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.920 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Releasing lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.920 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Instance network_info: |[{"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.921 187287 DEBUG oslo_concurrency.lockutils [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.921 187287 DEBUG nova.network.neutron [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Refreshing network info cache for port 1b6ad066-7586-4a73-8359-26ddcc7fe684 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.924 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Start _get_guest_xml network_info=[{"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.929 187287 WARNING nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.937 187287 DEBUG nova.virt.libvirt.host [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.937 187287 DEBUG nova.virt.libvirt.host [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.944 187287 DEBUG nova.virt.libvirt.host [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.945 187287 DEBUG nova.virt.libvirt.host [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.946 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.946 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.947 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.947 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.947 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.947 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.948 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.948 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.948 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.948 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.948 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.949 187287 DEBUG nova.virt.hardware [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.952 187287 DEBUG nova.virt.libvirt.vif [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-631160386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-631160386',id=3,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-j011ek2n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:19:40Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=5a133b9a-7b3b-4026-bc0f-c3b4a7587999,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.952 187287 DEBUG nova.network.os_vif_util [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converting VIF {"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.952 187287 DEBUG nova.network.os_vif_util [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.953 187287 DEBUG nova.objects.instance [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lazy-loading 'pci_devices' on Instance uuid 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.974 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <uuid>5a133b9a-7b3b-4026-bc0f-c3b4a7587999</uuid>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <name>instance-00000003</name>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-631160386</nova:name>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:19:48</nova:creationTime>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:user uuid="a4376db45a5649bbab6eb86fb45a0248">tempest-TestExecuteActionsViaActuator-1110081854-project-member</nova:user>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:project uuid="8a16e69f7b8f43529e0c039245ec148d">tempest-TestExecuteActionsViaActuator-1110081854</nova:project>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        <nova:port uuid="1b6ad066-7586-4a73-8359-26ddcc7fe684">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <entry name="serial">5a133b9a-7b3b-4026-bc0f-c3b4a7587999</entry>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <entry name="uuid">5a133b9a-7b3b-4026-bc0f-c3b4a7587999</entry>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk.config"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:35:af:46"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <target dev="tap1b6ad066-75"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/console.log" append="off"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:19:48 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:19:48 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:19:48 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:19:48 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.976 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Preparing to wait for external event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.976 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.977 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.977 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.978 187287 DEBUG nova.virt.libvirt.vif [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-631160386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-631160386',id=3,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-j011ek2n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:19:40Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=5a133b9a-7b3b-4026-bc0f-c3b4a7587999,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.978 187287 DEBUG nova.network.os_vif_util [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converting VIF {"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.979 187287 DEBUG nova.network.os_vif_util [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.979 187287 DEBUG os_vif [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.979 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.980 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.980 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.983 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.984 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b6ad066-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.984 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1b6ad066-75, col_values=(('external_ids', {'iface-id': '1b6ad066-7586-4a73-8359-26ddcc7fe684', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:35:af:46', 'vm-uuid': '5a133b9a-7b3b-4026-bc0f-c3b4a7587999'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.986 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:19:48 np0005544118 NetworkManager[55710]: <info>  [1764771588.9873] manager: (tap1b6ad066-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.988 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.993 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:19:48 np0005544118 nova_compute[187283]: 2025-12-03 14:19:48.994 187287 INFO os_vif [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75')
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.064 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.066 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.066 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No VIF found with MAC fa:16:3e:35:af:46, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.066 187287 INFO nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Using config drive
Dec  3 09:19:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:19:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:19:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:19:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:19:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:19:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:19:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:19:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.759 187287 INFO nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Creating config drive at /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk.config
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.765 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy_18ntvh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.888 187287 DEBUG oslo_concurrency.processutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy_18ntvh" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  3 09:19:49 np0005544118 kernel: tap1b6ad066-75: entered promiscuous mode
Dec  3 09:19:49 np0005544118 NetworkManager[55710]: <info>  [1764771589.9358] manager: (tap1b6ad066-75): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Dec  3 09:19:49 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:49Z|00040|binding|INFO|Claiming lport 1b6ad066-7586-4a73-8359-26ddcc7fe684 for this chassis.
Dec  3 09:19:49 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:49Z|00041|binding|INFO|1b6ad066-7586-4a73-8359-26ddcc7fe684: Claiming fa:16:3e:35:af:46 10.100.0.5
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.936 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:19:49 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:49.946 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:af:46 10.100.0.5'], port_security=['fa:16:3e:35:af:46 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5a133b9a-7b3b-4026-bc0f-c3b4a7587999', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=1b6ad066-7586-4a73-8359-26ddcc7fe684) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  3 09:19:49 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:49.948 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 1b6ad066-7586-4a73-8359-26ddcc7fe684 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 bound to our chassis
Dec  3 09:19:49 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:49Z|00042|binding|INFO|Setting lport 1b6ad066-7586-4a73-8359-26ddcc7fe684 ovn-installed in OVS
Dec  3 09:19:49 np0005544118 ovn_controller[95637]: 2025-12-03T14:19:49Z|00043|binding|INFO|Setting lport 1b6ad066-7586-4a73-8359-26ddcc7fe684 up in Southbound
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.949 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:49 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:49.949 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08007569-27ca-4ce6-b140-d7ea7d6cd593#033[00m
Dec  3 09:19:49 np0005544118 nova_compute[187283]: 2025-12-03 14:19:49.952 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:49 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:49.965 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a0fca59e-b33d-4098-bfa8-64455f7d6057]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:49 np0005544118 systemd-udevd[209374]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:19:49 np0005544118 systemd-machined[153602]: New machine qemu-3-instance-00000003.
Dec  3 09:19:49 np0005544118 NetworkManager[55710]: <info>  [1764771589.9822] device (tap1b6ad066-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:19:49 np0005544118 NetworkManager[55710]: <info>  [1764771589.9833] device (tap1b6ad066-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:19:49 np0005544118 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Dec  3 09:19:49 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:49.996 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[fa11ad21-ade2-4ef6-a8a9-2c93a34c13a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:49 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:49.999 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[dada643d-170f-4bfb-bfa2-adde58bb36b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.024 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[953dd90b-6bfa-4948-bd97-e28abc8da82a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.042 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e915d41a-e5ba-432a-b917-77370259a98c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372873, 'reachable_time': 42731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209382, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.057 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[88358390-2f79-4274-8651-92d818a8effe]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372885, 'tstamp': 372885}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209387, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372889, 'tstamp': 372889}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209387, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.059 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.061 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.062 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08007569-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.063 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.063 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08007569-20, col_values=(('external_ids', {'iface-id': '445f8411-d8a5-4a44-8c5f-54cc69a35119'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:19:50 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:19:50.063 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.238 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771590.2378964, 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.238 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] VM Started (Lifecycle Event)#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.257 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.264 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771590.2390313, 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.264 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.290 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.294 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.317 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.738 187287 DEBUG nova.compute.manager [req-dd2165f3-6d1e-4a69-83b0-5d48cba6ff04 req-01fa1e6f-013a-4c89-8aaa-05f10cac712e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.739 187287 DEBUG oslo_concurrency.lockutils [req-dd2165f3-6d1e-4a69-83b0-5d48cba6ff04 req-01fa1e6f-013a-4c89-8aaa-05f10cac712e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.739 187287 DEBUG oslo_concurrency.lockutils [req-dd2165f3-6d1e-4a69-83b0-5d48cba6ff04 req-01fa1e6f-013a-4c89-8aaa-05f10cac712e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.739 187287 DEBUG oslo_concurrency.lockutils [req-dd2165f3-6d1e-4a69-83b0-5d48cba6ff04 req-01fa1e6f-013a-4c89-8aaa-05f10cac712e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.740 187287 DEBUG nova.compute.manager [req-dd2165f3-6d1e-4a69-83b0-5d48cba6ff04 req-01fa1e6f-013a-4c89-8aaa-05f10cac712e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Processing event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.740 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.744 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771590.7443404, 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.744 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.746 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.749 187287 INFO nova.virt.libvirt.driver [-] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Instance spawned successfully.#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.750 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.768 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.785 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.791 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.792 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.793 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.794 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.795 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.795 187287 DEBUG nova.virt.libvirt.driver [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.829 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.870 187287 INFO nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Took 10.21 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.870 187287 DEBUG nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.936 187287 INFO nova.compute.manager [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Took 10.80 seconds to build instance.#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.955 187287 DEBUG oslo_concurrency.lockutils [None req-e6ee6613-ac7a-4317-a8a1-46771b5c8f46 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.970 187287 DEBUG nova.network.neutron [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Updated VIF entry in instance network info cache for port 1b6ad066-7586-4a73-8359-26ddcc7fe684. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.971 187287 DEBUG nova.network.neutron [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Updating instance_info_cache with network_info: [{"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:19:50 np0005544118 nova_compute[187283]: 2025-12-03 14:19:50.987 187287 DEBUG oslo_concurrency.lockutils [req-c8becf37-060b-4246-9271-500530be3b5e req-1372e608-ce41-46be-b67c-9ac04ae564b6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:19:51 np0005544118 podman[209396]: 2025-12-03 14:19:51.825473235 +0000 UTC m=+0.057105342 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:19:52 np0005544118 nova_compute[187283]: 2025-12-03 14:19:52.833 187287 DEBUG nova.compute.manager [req-b8aed276-749e-420e-b98e-e07eb0688e0b req-bcec337c-4a2a-4325-a2f3-0a2a2151ccdb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:19:52 np0005544118 nova_compute[187283]: 2025-12-03 14:19:52.834 187287 DEBUG oslo_concurrency.lockutils [req-b8aed276-749e-420e-b98e-e07eb0688e0b req-bcec337c-4a2a-4325-a2f3-0a2a2151ccdb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:19:52 np0005544118 nova_compute[187283]: 2025-12-03 14:19:52.834 187287 DEBUG oslo_concurrency.lockutils [req-b8aed276-749e-420e-b98e-e07eb0688e0b req-bcec337c-4a2a-4325-a2f3-0a2a2151ccdb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:19:52 np0005544118 nova_compute[187283]: 2025-12-03 14:19:52.834 187287 DEBUG oslo_concurrency.lockutils [req-b8aed276-749e-420e-b98e-e07eb0688e0b req-bcec337c-4a2a-4325-a2f3-0a2a2151ccdb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:19:52 np0005544118 nova_compute[187283]: 2025-12-03 14:19:52.835 187287 DEBUG nova.compute.manager [req-b8aed276-749e-420e-b98e-e07eb0688e0b req-bcec337c-4a2a-4325-a2f3-0a2a2151ccdb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:19:52 np0005544118 nova_compute[187283]: 2025-12-03 14:19:52.835 187287 WARNING nova.compute.manager [req-b8aed276-749e-420e-b98e-e07eb0688e0b req-bcec337c-4a2a-4325-a2f3-0a2a2151ccdb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received unexpected event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:19:53 np0005544118 podman[209416]: 2025-12-03 14:19:53.828278348 +0000 UTC m=+0.055791437 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:19:53 np0005544118 nova_compute[187283]: 2025-12-03 14:19:53.851 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:53 np0005544118 nova_compute[187283]: 2025-12-03 14:19:53.987 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:57 np0005544118 podman[209440]: 2025-12-03 14:19:57.911825341 +0000 UTC m=+0.126124927 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:19:58 np0005544118 nova_compute[187283]: 2025-12-03 14:19:58.853 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:19:58 np0005544118 nova_compute[187283]: 2025-12-03 14:19:58.988 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:00.950 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:00.951 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:00.952 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:03 np0005544118 nova_compute[187283]: 2025-12-03 14:20:03.855 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:03 np0005544118 nova_compute[187283]: 2025-12-03 14:20:03.989 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:04 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:04Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:35:af:46 10.100.0.5
Dec  3 09:20:04 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:04Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:35:af:46 10.100.0.5
Dec  3 09:20:04 np0005544118 nova_compute[187283]: 2025-12-03 14:20:04.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:04 np0005544118 nova_compute[187283]: 2025-12-03 14:20:04.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:05 np0005544118 nova_compute[187283]: 2025-12-03 14:20:05.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:05 np0005544118 nova_compute[187283]: 2025-12-03 14:20:05.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:20:05 np0005544118 nova_compute[187283]: 2025-12-03 14:20:05.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:20:05 np0005544118 podman[197639]: time="2025-12-03T14:20:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:20:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:20:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:20:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:20:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3047 "" "Go-http-client/1.1"
Dec  3 09:20:05 np0005544118 nova_compute[187283]: 2025-12-03 14:20:05.815 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:20:05 np0005544118 nova_compute[187283]: 2025-12-03 14:20:05.816 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:20:05 np0005544118 nova_compute[187283]: 2025-12-03 14:20:05.816 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:20:05 np0005544118 nova_compute[187283]: 2025-12-03 14:20:05.817 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.665 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [{"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.686 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-7e534904-afa6-40ef-bb5a-ac4971f60d75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.687 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.687 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.688 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.688 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.712 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.712 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.713 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.713 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.795 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:08 np0005544118 podman[209487]: 2025-12-03 14:20:08.816190149 +0000 UTC m=+0.057156193 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.851 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.852 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.868 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.910 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.916 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:08 np0005544118 nova_compute[187283]: 2025-12-03 14:20:08.991 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.066 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.067 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.117 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.296 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.297 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5542MB free_disk=73.28482818603516GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.297 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.298 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.374 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 7e534904-afa6-40ef-bb5a-ac4971f60d75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.375 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.375 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.375 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.431 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.445 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.467 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:20:09 np0005544118 nova_compute[187283]: 2025-12-03 14:20:09.467 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:11 np0005544118 nova_compute[187283]: 2025-12-03 14:20:11.387 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:11 np0005544118 nova_compute[187283]: 2025-12-03 14:20:11.388 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:11 np0005544118 nova_compute[187283]: 2025-12-03 14:20:11.388 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:20:11 np0005544118 nova_compute[187283]: 2025-12-03 14:20:11.603 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:12 np0005544118 podman[209522]: 2025-12-03 14:20:12.852042455 +0000 UTC m=+0.086670065 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  3 09:20:13 np0005544118 nova_compute[187283]: 2025-12-03 14:20:13.859 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:13 np0005544118 nova_compute[187283]: 2025-12-03 14:20:13.994 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.714 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.714 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.736 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.818 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.818 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.827 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.827 187287 INFO nova.compute.claims [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.969 187287 DEBUG nova.compute.provider_tree [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:20:16 np0005544118 nova_compute[187283]: 2025-12-03 14:20:16.983 187287 DEBUG nova.scheduler.client.report [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.009 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.010 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.092 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.092 187287 DEBUG nova.network.neutron [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.116 187287 INFO nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.141 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.234 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.235 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.236 187287 INFO nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Creating image(s)#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.236 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.237 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.237 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.252 187287 DEBUG nova.policy [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4376db45a5649bbab6eb86fb45a0248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a16e69f7b8f43529e0c039245ec148d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.255 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.310 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.312 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.313 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.341 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.400 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.401 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.456 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk 1073741824" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.458 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.459 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.539 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.541 187287 DEBUG nova.virt.disk.api [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Checking if we can resize image /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.542 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.598 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.599 187287 DEBUG nova.virt.disk.api [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Cannot resize image /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.599 187287 DEBUG nova.objects.instance [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lazy-loading 'migration_context' on Instance uuid 630952de-d907-4370-aed3-0bec512896a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.707 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.708 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Ensure instance console log exists: /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.708 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.709 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:17 np0005544118 nova_compute[187283]: 2025-12-03 14:20:17.709 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:18 np0005544118 nova_compute[187283]: 2025-12-03 14:20:18.167 187287 DEBUG nova.network.neutron [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Successfully created port: 49af6e1a-ee13-46b0-b0b8-e99e240fb96d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:20:18 np0005544118 nova_compute[187283]: 2025-12-03 14:20:18.862 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:18 np0005544118 nova_compute[187283]: 2025-12-03 14:20:18.996 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.262 187287 DEBUG nova.network.neutron [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Successfully updated port: 49af6e1a-ee13-46b0-b0b8-e99e240fb96d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.276 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.276 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquired lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.277 187287 DEBUG nova.network.neutron [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.353 187287 DEBUG nova.compute.manager [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-changed-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.353 187287 DEBUG nova.compute.manager [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Refreshing instance network info cache due to event network-changed-49af6e1a-ee13-46b0-b0b8-e99e240fb96d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.353 187287 DEBUG oslo_concurrency.lockutils [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:20:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:20:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:20:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:20:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:20:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:20:19 np0005544118 nova_compute[187283]: 2025-12-03 14:20:19.634 187287 DEBUG nova.network.neutron [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.311 187287 DEBUG nova.network.neutron [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updating instance_info_cache with network_info: [{"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.336 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Releasing lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.336 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Instance network_info: |[{"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.336 187287 DEBUG oslo_concurrency.lockutils [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.337 187287 DEBUG nova.network.neutron [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Refreshing network info cache for port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.339 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Start _get_guest_xml network_info=[{"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.343 187287 WARNING nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.348 187287 DEBUG nova.virt.libvirt.host [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.348 187287 DEBUG nova.virt.libvirt.host [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.353 187287 DEBUG nova.virt.libvirt.host [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.354 187287 DEBUG nova.virt.libvirt.host [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.355 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.355 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.356 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.356 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.356 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.357 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.357 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.357 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.357 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.358 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.358 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.358 187287 DEBUG nova.virt.hardware [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.362 187287 DEBUG nova.virt.libvirt.vif [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1704045789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1704045789',id=5,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-ymg8wtcd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-
project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:20:17Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=630952de-d907-4370-aed3-0bec512896a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.362 187287 DEBUG nova.network.os_vif_util [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converting VIF {"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.363 187287 DEBUG nova.network.os_vif_util [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.364 187287 DEBUG nova.objects.instance [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lazy-loading 'pci_devices' on Instance uuid 630952de-d907-4370-aed3-0bec512896a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.378 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <uuid>630952de-d907-4370-aed3-0bec512896a1</uuid>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <name>instance-00000005</name>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteActionsViaActuator-server-1704045789</nova:name>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:20:20</nova:creationTime>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:user uuid="a4376db45a5649bbab6eb86fb45a0248">tempest-TestExecuteActionsViaActuator-1110081854-project-member</nova:user>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:project uuid="8a16e69f7b8f43529e0c039245ec148d">tempest-TestExecuteActionsViaActuator-1110081854</nova:project>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        <nova:port uuid="49af6e1a-ee13-46b0-b0b8-e99e240fb96d">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <entry name="serial">630952de-d907-4370-aed3-0bec512896a1</entry>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <entry name="uuid">630952de-d907-4370-aed3-0bec512896a1</entry>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.config"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:17:e7:b4"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <target dev="tap49af6e1a-ee"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/console.log" append="off"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:20:20 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:20:20 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:20:20 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:20:20 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.379 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Preparing to wait for external event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.379 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.380 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.380 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.381 187287 DEBUG nova.virt.libvirt.vif [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1704045789',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1704045789',id=5,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-ymg8wtcd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1
110081854-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:20:17Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=630952de-d907-4370-aed3-0bec512896a1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.381 187287 DEBUG nova.network.os_vif_util [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converting VIF {"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.381 187287 DEBUG nova.network.os_vif_util [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.382 187287 DEBUG os_vif [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.382 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.383 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.383 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.386 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.386 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49af6e1a-ee, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.387 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap49af6e1a-ee, col_values=(('external_ids', {'iface-id': '49af6e1a-ee13-46b0-b0b8-e99e240fb96d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:e7:b4', 'vm-uuid': '630952de-d907-4370-aed3-0bec512896a1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.428 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:20 np0005544118 NetworkManager[55710]: <info>  [1764771620.4304] manager: (tap49af6e1a-ee): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.431 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.434 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.435 187287 INFO os_vif [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee')#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.479 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.479 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.480 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] No VIF found with MAC fa:16:3e:17:e7:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.480 187287 INFO nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Using config drive#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.751 187287 INFO nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Creating config drive at /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.config#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.761 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1guypmn8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.904 187287 DEBUG oslo_concurrency.processutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1guypmn8" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:20 np0005544118 kernel: tap49af6e1a-ee: entered promiscuous mode
Dec  3 09:20:20 np0005544118 NetworkManager[55710]: <info>  [1764771620.9682] manager: (tap49af6e1a-ee): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Dec  3 09:20:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:20Z|00044|binding|INFO|Claiming lport 49af6e1a-ee13-46b0-b0b8-e99e240fb96d for this chassis.
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.969 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:20Z|00045|binding|INFO|49af6e1a-ee13-46b0-b0b8-e99e240fb96d: Claiming fa:16:3e:17:e7:b4 10.100.0.4
Dec  3 09:20:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:20.984 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e7:b4 10.100.0.4'], port_security=['fa:16:3e:17:e7:b4 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '630952de-d907-4370-aed3-0bec512896a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=49af6e1a-ee13-46b0-b0b8-e99e240fb96d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:20:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:20Z|00046|binding|INFO|Setting lport 49af6e1a-ee13-46b0-b0b8-e99e240fb96d ovn-installed in OVS
Dec  3 09:20:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:20Z|00047|binding|INFO|Setting lport 49af6e1a-ee13-46b0-b0b8-e99e240fb96d up in Southbound
Dec  3 09:20:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:20.986 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 bound to our chassis#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.987 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:20.988 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08007569-27ca-4ce6-b140-d7ea7d6cd593#033[00m
Dec  3 09:20:20 np0005544118 nova_compute[187283]: 2025-12-03 14:20:20.993 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:20 np0005544118 systemd-udevd[209574]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.007 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[98de574a-655d-48c4-866a-fc31cb5552b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:21 np0005544118 NetworkManager[55710]: <info>  [1764771621.0141] device (tap49af6e1a-ee): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:20:21 np0005544118 NetworkManager[55710]: <info>  [1764771621.0161] device (tap49af6e1a-ee): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:20:21 np0005544118 systemd-machined[153602]: New machine qemu-4-instance-00000005.
Dec  3 09:20:21 np0005544118 systemd[1]: Started Virtual Machine qemu-4-instance-00000005.
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.039 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[51649219-555b-4507-94ed-d07bea4859a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.042 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef16055-28e5-4ec9-b884-f971d247f1b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.072 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[767b5212-16d3-435f-bbd0-3029a86d5c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.090 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2e6fb2a3-2342-4b92-b348-4229dfd8ffe4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372873, 'reachable_time': 42731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209590, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.112 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fb79cdf8-2e4e-4bed-8b5f-09deb673542e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372885, 'tstamp': 372885}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209591, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372889, 'tstamp': 372889}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209591, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.115 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.116 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.117 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.117 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08007569-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.117 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.118 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08007569-20, col_values=(('external_ids', {'iface-id': '445f8411-d8a5-4a44-8c5f-54cc69a35119'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:21.118 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.190 187287 DEBUG nova.compute.manager [req-66361f2c-d105-4661-b0de-a9929f0a0283 req-cc4dd085-bb5d-46fe-a075-c9c4d4e25304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.191 187287 DEBUG oslo_concurrency.lockutils [req-66361f2c-d105-4661-b0de-a9929f0a0283 req-cc4dd085-bb5d-46fe-a075-c9c4d4e25304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.191 187287 DEBUG oslo_concurrency.lockutils [req-66361f2c-d105-4661-b0de-a9929f0a0283 req-cc4dd085-bb5d-46fe-a075-c9c4d4e25304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.191 187287 DEBUG oslo_concurrency.lockutils [req-66361f2c-d105-4661-b0de-a9929f0a0283 req-cc4dd085-bb5d-46fe-a075-c9c4d4e25304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.191 187287 DEBUG nova.compute.manager [req-66361f2c-d105-4661-b0de-a9929f0a0283 req-cc4dd085-bb5d-46fe-a075-c9c4d4e25304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Processing event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.343 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771621.342855, 630952de-d907-4370-aed3-0bec512896a1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.344 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] VM Started (Lifecycle Event)#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.347 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.350 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.353 187287 INFO nova.virt.libvirt.driver [-] [instance: 630952de-d907-4370-aed3-0bec512896a1] Instance spawned successfully.#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.353 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.373 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.377 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.377 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.378 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.378 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.378 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.379 187287 DEBUG nova.virt.libvirt.driver [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.387 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.420 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.421 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771621.3432112, 630952de-d907-4370-aed3-0bec512896a1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.421 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.442 187287 DEBUG nova.network.neutron [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updated VIF entry in instance network info cache for port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.442 187287 DEBUG nova.network.neutron [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updating instance_info_cache with network_info: [{"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.444 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.448 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771621.349268, 630952de-d907-4370-aed3-0bec512896a1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.448 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.453 187287 INFO nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Took 4.22 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.454 187287 DEBUG nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.468 187287 DEBUG oslo_concurrency.lockutils [req-2bcd43b5-21ae-4428-aeb9-0c163c6ff45a req-893c49c1-0bb2-4bb0-b5ec-7a62551c8f74 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.476 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.479 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.542 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.571 187287 INFO nova.compute.manager [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Took 4.78 seconds to build instance.#033[00m
Dec  3 09:20:21 np0005544118 nova_compute[187283]: 2025-12-03 14:20:21.589 187287 DEBUG oslo_concurrency.lockutils [None req-f547f056-f376-491c-a934-340603132fbb a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:22 np0005544118 podman[209600]: 2025-12-03 14:20:22.847258236 +0000 UTC m=+0.054932663 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent)
Dec  3 09:20:23 np0005544118 nova_compute[187283]: 2025-12-03 14:20:23.266 187287 DEBUG nova.compute.manager [req-ed0f922f-a481-42d0-807e-7e2f8ad8469a req-3b4f7ba0-2e49-44d5-848e-be52b9d5a2ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:23 np0005544118 nova_compute[187283]: 2025-12-03 14:20:23.266 187287 DEBUG oslo_concurrency.lockutils [req-ed0f922f-a481-42d0-807e-7e2f8ad8469a req-3b4f7ba0-2e49-44d5-848e-be52b9d5a2ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:23 np0005544118 nova_compute[187283]: 2025-12-03 14:20:23.266 187287 DEBUG oslo_concurrency.lockutils [req-ed0f922f-a481-42d0-807e-7e2f8ad8469a req-3b4f7ba0-2e49-44d5-848e-be52b9d5a2ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:23 np0005544118 nova_compute[187283]: 2025-12-03 14:20:23.267 187287 DEBUG oslo_concurrency.lockutils [req-ed0f922f-a481-42d0-807e-7e2f8ad8469a req-3b4f7ba0-2e49-44d5-848e-be52b9d5a2ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:23 np0005544118 nova_compute[187283]: 2025-12-03 14:20:23.267 187287 DEBUG nova.compute.manager [req-ed0f922f-a481-42d0-807e-7e2f8ad8469a req-3b4f7ba0-2e49-44d5-848e-be52b9d5a2ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] No waiting events found dispatching network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:20:23 np0005544118 nova_compute[187283]: 2025-12-03 14:20:23.267 187287 WARNING nova.compute.manager [req-ed0f922f-a481-42d0-807e-7e2f8ad8469a req-3b4f7ba0-2e49-44d5-848e-be52b9d5a2ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received unexpected event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d for instance with vm_state active and task_state None.#033[00m
Dec  3 09:20:23 np0005544118 nova_compute[187283]: 2025-12-03 14:20:23.863 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:24 np0005544118 podman[209619]: 2025-12-03 14:20:24.859756892 +0000 UTC m=+0.078730550 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:20:25 np0005544118 nova_compute[187283]: 2025-12-03 14:20:25.429 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:28 np0005544118 nova_compute[187283]: 2025-12-03 14:20:28.865 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:28 np0005544118 podman[209643]: 2025-12-03 14:20:28.86911248 +0000 UTC m=+0.105845527 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Dec  3 09:20:30 np0005544118 nova_compute[187283]: 2025-12-03 14:20:30.468 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:33 np0005544118 nova_compute[187283]: 2025-12-03 14:20:33.867 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:33 np0005544118 nova_compute[187283]: 2025-12-03 14:20:33.956 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:33.958 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:20:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:33.959 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:20:35 np0005544118 nova_compute[187283]: 2025-12-03 14:20:35.470 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:35Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:17:e7:b4 10.100.0.4
Dec  3 09:20:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:35Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:17:e7:b4 10.100.0.4
Dec  3 09:20:35 np0005544118 podman[197639]: time="2025-12-03T14:20:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:20:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:20:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:20:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:20:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Dec  3 09:20:38 np0005544118 nova_compute[187283]: 2025-12-03 14:20:38.869 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:39 np0005544118 podman[209683]: 2025-12-03 14:20:39.899582714 +0000 UTC m=+0.110266617 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:20:40 np0005544118 nova_compute[187283]: 2025-12-03 14:20:40.482 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:42.962 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:43 np0005544118 podman[209706]: 2025-12-03 14:20:43.832523556 +0000 UTC m=+0.063093595 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:20:43 np0005544118 nova_compute[187283]: 2025-12-03 14:20:43.872 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:45 np0005544118 nova_compute[187283]: 2025-12-03 14:20:45.484 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:45 np0005544118 nova_compute[187283]: 2025-12-03 14:20:45.563 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Check if temp file /var/lib/nova/instances/tmpwx3nxoh9 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Dec  3 09:20:45 np0005544118 nova_compute[187283]: 2025-12-03 14:20:45.564 187287 DEBUG nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwx3nxoh9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a133b9a-7b3b-4026-bc0f-c3b4a7587999',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Dec  3 09:20:46 np0005544118 nova_compute[187283]: 2025-12-03 14:20:46.607 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:46 np0005544118 nova_compute[187283]: 2025-12-03 14:20:46.625 187287 DEBUG oslo_concurrency.lockutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:20:46 np0005544118 nova_compute[187283]: 2025-12-03 14:20:46.626 187287 DEBUG oslo_concurrency.lockutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:20:46 np0005544118 nova_compute[187283]: 2025-12-03 14:20:46.626 187287 DEBUG nova.network.neutron [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:20:46 np0005544118 nova_compute[187283]: 2025-12-03 14:20:46.694 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:46 np0005544118 nova_compute[187283]: 2025-12-03 14:20:46.695 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:46 np0005544118 nova_compute[187283]: 2025-12-03 14:20:46.758 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:48 np0005544118 nova_compute[187283]: 2025-12-03 14:20:48.874 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:48 np0005544118 nova_compute[187283]: 2025-12-03 14:20:48.980 187287 DEBUG nova.network.neutron [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updating instance_info_cache with network_info: [{"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.009 187287 DEBUG oslo_concurrency.lockutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.155 187287 DEBUG nova.virt.libvirt.driver [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.156 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Creating file /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/b8a58d635a234a4a8da6e768d0ff0cd3.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.156 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/b8a58d635a234a4a8da6e768d0ff0cd3.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:20:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:20:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:20:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:20:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:20:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:20:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:20:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.599 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/b8a58d635a234a4a8da6e768d0ff0cd3.tmp" returned: 1 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.600 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/b8a58d635a234a4a8da6e768d0ff0cd3.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.600 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Creating directory /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.600 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.793 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1" returned: 0 in 0.193s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:49 np0005544118 nova_compute[187283]: 2025-12-03 14:20:49.798 187287 DEBUG nova.virt.libvirt.driver [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec  3 09:20:50 np0005544118 nova_compute[187283]: 2025-12-03 14:20:50.488 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:50 np0005544118 systemd[1]: Created slice User Slice of UID 42436.
Dec  3 09:20:50 np0005544118 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec  3 09:20:50 np0005544118 systemd-logind[795]: New session 28 of user nova.
Dec  3 09:20:50 np0005544118 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec  3 09:20:50 np0005544118 systemd[1]: Starting User Manager for UID 42436...
Dec  3 09:20:50 np0005544118 systemd[209752]: Queued start job for default target Main User Target.
Dec  3 09:20:50 np0005544118 systemd[209752]: Created slice User Application Slice.
Dec  3 09:20:50 np0005544118 systemd[209752]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:20:50 np0005544118 systemd[209752]: Started Daily Cleanup of User's Temporary Directories.
Dec  3 09:20:50 np0005544118 systemd[209752]: Reached target Paths.
Dec  3 09:20:50 np0005544118 systemd[209752]: Reached target Timers.
Dec  3 09:20:50 np0005544118 systemd[209752]: Starting D-Bus User Message Bus Socket...
Dec  3 09:20:50 np0005544118 systemd[209752]: Starting Create User's Volatile Files and Directories...
Dec  3 09:20:50 np0005544118 systemd[209752]: Finished Create User's Volatile Files and Directories.
Dec  3 09:20:50 np0005544118 systemd[209752]: Listening on D-Bus User Message Bus Socket.
Dec  3 09:20:50 np0005544118 systemd[209752]: Reached target Sockets.
Dec  3 09:20:50 np0005544118 systemd[209752]: Reached target Basic System.
Dec  3 09:20:50 np0005544118 systemd[209752]: Reached target Main User Target.
Dec  3 09:20:50 np0005544118 systemd[209752]: Startup finished in 144ms.
Dec  3 09:20:50 np0005544118 systemd[1]: Started User Manager for UID 42436.
Dec  3 09:20:50 np0005544118 systemd[1]: Started Session 28 of User nova.
Dec  3 09:20:50 np0005544118 systemd[1]: session-28.scope: Deactivated successfully.
Dec  3 09:20:50 np0005544118 systemd-logind[795]: Session 28 logged out. Waiting for processes to exit.
Dec  3 09:20:50 np0005544118 systemd-logind[795]: Removed session 28.
Dec  3 09:20:52 np0005544118 kernel: tap49af6e1a-ee (unregistering): left promiscuous mode
Dec  3 09:20:52 np0005544118 NetworkManager[55710]: <info>  [1764771652.0038] device (tap49af6e1a-ee): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.011 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:52 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:52Z|00048|binding|INFO|Releasing lport 49af6e1a-ee13-46b0-b0b8-e99e240fb96d from this chassis (sb_readonly=0)
Dec  3 09:20:52 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:52Z|00049|binding|INFO|Setting lport 49af6e1a-ee13-46b0-b0b8-e99e240fb96d down in Southbound
Dec  3 09:20:52 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:52Z|00050|binding|INFO|Removing iface tap49af6e1a-ee ovn-installed in OVS
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.015 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.027 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.029 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:e7:b4 10.100.0.4'], port_security=['fa:16:3e:17:e7:b4 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '630952de-d907-4370-aed3-0bec512896a1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=49af6e1a-ee13-46b0-b0b8-e99e240fb96d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.030 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 unbound from our chassis#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.031 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08007569-27ca-4ce6-b140-d7ea7d6cd593#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.047 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6b48b8ec-7df2-48f6-a3e7-57b7ae488626]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:52 np0005544118 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec  3 09:20:52 np0005544118 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000005.scope: Consumed 14.130s CPU time.
Dec  3 09:20:52 np0005544118 systemd-machined[153602]: Machine qemu-4-instance-00000005 terminated.
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.074 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[f514eb95-9134-47e8-9565-edfe4e6dd22a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.079 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ae65a4-1ea5-440b-b798-c15729463e47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.079 187287 DEBUG nova.compute.manager [req-878c525b-ca0a-4f32-abdf-018164907c48 req-326acd57-66d8-4576-b3ad-c3e07cf40444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.080 187287 DEBUG oslo_concurrency.lockutils [req-878c525b-ca0a-4f32-abdf-018164907c48 req-326acd57-66d8-4576-b3ad-c3e07cf40444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.080 187287 DEBUG oslo_concurrency.lockutils [req-878c525b-ca0a-4f32-abdf-018164907c48 req-326acd57-66d8-4576-b3ad-c3e07cf40444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.080 187287 DEBUG oslo_concurrency.lockutils [req-878c525b-ca0a-4f32-abdf-018164907c48 req-326acd57-66d8-4576-b3ad-c3e07cf40444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.080 187287 DEBUG nova.compute.manager [req-878c525b-ca0a-4f32-abdf-018164907c48 req-326acd57-66d8-4576-b3ad-c3e07cf40444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.080 187287 DEBUG nova.compute.manager [req-878c525b-ca0a-4f32-abdf-018164907c48 req-326acd57-66d8-4576-b3ad-c3e07cf40444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.105 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f18105-ea10-4ba6-ac3f-20c51fc8535f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.123 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[565640c3-0206-46f2-b72b-64eb865cf2ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372873, 'reachable_time': 42731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209781, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.142 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1ff2e396-396b-4fb0-a8f6-0e93ffa4b14b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372885, 'tstamp': 372885}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209782, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372889, 'tstamp': 372889}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209782, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.144 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.146 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.150 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.151 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08007569-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.151 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.151 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08007569-20, col_values=(('external_ids', {'iface-id': '445f8411-d8a5-4a44-8c5f-54cc69a35119'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:52.152 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.816 187287 INFO nova.virt.libvirt.driver [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Instance shutdown successfully after 3 seconds.#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.825 187287 INFO nova.virt.libvirt.driver [-] [instance: 630952de-d907-4370-aed3-0bec512896a1] Instance destroyed successfully.#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.826 187287 DEBUG nova.virt.libvirt.vif [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1704045789',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1704045789',id=5,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:20:21Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-ymg8wtcd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:20:45Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=630952de-d907-4370-aed3-0bec512896a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:17:e7:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.827 187287 DEBUG nova.network.os_vif_util [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "vif_mac": "fa:16:3e:17:e7:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.828 187287 DEBUG nova.network.os_vif_util [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.829 187287 DEBUG os_vif [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.831 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.832 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49af6e1a-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.834 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.837 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.841 187287 INFO os_vif [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee')#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.845 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.937 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:52 np0005544118 nova_compute[187283]: 2025-12-03 14:20:52.939 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.010 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.012 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk to 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.012 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.058 187287 DEBUG nova.compute.manager [req-f3cb0da8-5b56-4d5a-949c-2c0a4b7314fe req-f06aecd5-b37e-4c14-900d-18742f55d824 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-vif-unplugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.058 187287 DEBUG oslo_concurrency.lockutils [req-f3cb0da8-5b56-4d5a-949c-2c0a4b7314fe req-f06aecd5-b37e-4c14-900d-18742f55d824 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.059 187287 DEBUG oslo_concurrency.lockutils [req-f3cb0da8-5b56-4d5a-949c-2c0a4b7314fe req-f06aecd5-b37e-4c14-900d-18742f55d824 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.059 187287 DEBUG oslo_concurrency.lockutils [req-f3cb0da8-5b56-4d5a-949c-2c0a4b7314fe req-f06aecd5-b37e-4c14-900d-18742f55d824 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.059 187287 DEBUG nova.compute.manager [req-f3cb0da8-5b56-4d5a-949c-2c0a4b7314fe req-f06aecd5-b37e-4c14-900d-18742f55d824 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] No waiting events found dispatching network-vif-unplugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.059 187287 WARNING nova.compute.manager [req-f3cb0da8-5b56-4d5a-949c-2c0a4b7314fe req-f06aecd5-b37e-4c14-900d-18742f55d824 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received unexpected event network-vif-unplugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d for instance with vm_state active and task_state resize_migrating.#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.604 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -r /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk" returned: 0 in 0.591s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.604 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.605 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk.config 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.811 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk.config 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.config" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.812 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.812 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk.info 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:20:53 np0005544118 podman[209811]: 2025-12-03 14:20:53.860816375 +0000 UTC m=+0.089679536 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  3 09:20:53 np0005544118 nova_compute[187283]: 2025-12-03 14:20:53.877 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.029 187287 DEBUG oslo_concurrency.processutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1_resize/disk.info 192.168.122.100:/var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk.info" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.664 187287 DEBUG neutronclient.v2_0.client [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.697 187287 DEBUG nova.compute.manager [req-4d1c3502-1ef4-4ebc-a35e-10451856ddf4 req-6a274929-03d6-4b63-a5ec-72347e8ee688 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.698 187287 DEBUG oslo_concurrency.lockutils [req-4d1c3502-1ef4-4ebc-a35e-10451856ddf4 req-6a274929-03d6-4b63-a5ec-72347e8ee688 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.698 187287 DEBUG oslo_concurrency.lockutils [req-4d1c3502-1ef4-4ebc-a35e-10451856ddf4 req-6a274929-03d6-4b63-a5ec-72347e8ee688 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.698 187287 DEBUG oslo_concurrency.lockutils [req-4d1c3502-1ef4-4ebc-a35e-10451856ddf4 req-6a274929-03d6-4b63-a5ec-72347e8ee688 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.698 187287 DEBUG nova.compute.manager [req-4d1c3502-1ef4-4ebc-a35e-10451856ddf4 req-6a274929-03d6-4b63-a5ec-72347e8ee688 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.699 187287 WARNING nova.compute.manager [req-4d1c3502-1ef4-4ebc-a35e-10451856ddf4 req-6a274929-03d6-4b63-a5ec-72347e8ee688 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received unexpected event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.748 187287 DEBUG oslo_concurrency.lockutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.748 187287 DEBUG oslo_concurrency.lockutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.749 187287 DEBUG oslo_concurrency.lockutils [None req-aca9be0d-9554-46be-8961-11cb07351d1f b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.820 187287 INFO nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Took 8.06 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.821 187287 DEBUG nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.840 187287 DEBUG nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpwx3nxoh9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='5a133b9a-7b3b-4026-bc0f-c3b4a7587999',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(764541e1-aa4e-4e28-8f4e-f18e536761c9),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.865 187287 DEBUG nova.objects.instance [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.867 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.869 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.869 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.886 187287 DEBUG nova.virt.libvirt.vif [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-631160386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-631160386',id=3,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:19:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-j011ek2n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',im
age_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:19:50Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=5a133b9a-7b3b-4026-bc0f-c3b4a7587999,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.886 187287 DEBUG nova.network.os_vif_util [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.887 187287 DEBUG nova.network.os_vif_util [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.888 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Updating guest XML with vif config: <interface type="ethernet">
Dec  3 09:20:54 np0005544118 nova_compute[187283]:  <mac address="fa:16:3e:35:af:46"/>
Dec  3 09:20:54 np0005544118 nova_compute[187283]:  <model type="virtio"/>
Dec  3 09:20:54 np0005544118 nova_compute[187283]:  <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:20:54 np0005544118 nova_compute[187283]:  <mtu size="1442"/>
Dec  3 09:20:54 np0005544118 nova_compute[187283]:  <target dev="tap1b6ad066-75"/>
Dec  3 09:20:54 np0005544118 nova_compute[187283]: </interface>
Dec  3 09:20:54 np0005544118 nova_compute[187283]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Dec  3 09:20:54 np0005544118 nova_compute[187283]: 2025-12-03 14:20:54.889 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.153 187287 DEBUG nova.compute.manager [req-634e0aef-004b-4577-8176-9d0e6bcc8724 req-742072df-8574-4f39-9f0f-9eda6bd8c6fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.155 187287 DEBUG oslo_concurrency.lockutils [req-634e0aef-004b-4577-8176-9d0e6bcc8724 req-742072df-8574-4f39-9f0f-9eda6bd8c6fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.155 187287 DEBUG oslo_concurrency.lockutils [req-634e0aef-004b-4577-8176-9d0e6bcc8724 req-742072df-8574-4f39-9f0f-9eda6bd8c6fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.155 187287 DEBUG oslo_concurrency.lockutils [req-634e0aef-004b-4577-8176-9d0e6bcc8724 req-742072df-8574-4f39-9f0f-9eda6bd8c6fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.155 187287 DEBUG nova.compute.manager [req-634e0aef-004b-4577-8176-9d0e6bcc8724 req-742072df-8574-4f39-9f0f-9eda6bd8c6fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] No waiting events found dispatching network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.155 187287 WARNING nova.compute.manager [req-634e0aef-004b-4577-8176-9d0e6bcc8724 req-742072df-8574-4f39-9f0f-9eda6bd8c6fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received unexpected event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d for instance with vm_state active and task_state resize_migrated.#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.372 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.373 187287 INFO nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.431 187287 INFO nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Dec  3 09:20:55 np0005544118 podman[209833]: 2025-12-03 14:20:55.856304099 +0000 UTC m=+0.075651606 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.934 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:20:55 np0005544118 nova_compute[187283]: 2025-12-03 14:20:55.935 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.439 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.440 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.806 187287 DEBUG nova.compute.manager [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-changed-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.806 187287 DEBUG nova.compute.manager [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Refreshing instance network info cache due to event network-changed-1b6ad066-7586-4a73-8359-26ddcc7fe684. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.807 187287 DEBUG oslo_concurrency.lockutils [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.807 187287 DEBUG oslo_concurrency.lockutils [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.808 187287 DEBUG nova.network.neutron [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Refreshing network info cache for port 1b6ad066-7586-4a73-8359-26ddcc7fe684 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.832 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771656.8318567, 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.833 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.852 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.857 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.883 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.942 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.942 187287 DEBUG nova.virt.libvirt.migration [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:20:56 np0005544118 kernel: tap1b6ad066-75 (unregistering): left promiscuous mode
Dec  3 09:20:56 np0005544118 NetworkManager[55710]: <info>  [1764771656.9756] device (tap1b6ad066-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.980 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:56 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:56Z|00051|binding|INFO|Releasing lport 1b6ad066-7586-4a73-8359-26ddcc7fe684 from this chassis (sb_readonly=0)
Dec  3 09:20:56 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:56Z|00052|binding|INFO|Setting lport 1b6ad066-7586-4a73-8359-26ddcc7fe684 down in Southbound
Dec  3 09:20:56 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:56Z|00053|binding|INFO|Removing iface tap1b6ad066-75 ovn-installed in OVS
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.984 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:56 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:56.990 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:35:af:46 10.100.0.5'], port_security=['fa:16:3e:35:af:46 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3a9d7e7b-04f9-4aed-a199-9003ff5fe58c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '5a133b9a-7b3b-4026-bc0f-c3b4a7587999', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '8', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=1b6ad066-7586-4a73-8359-26ddcc7fe684) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:20:56 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:56.992 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 1b6ad066-7586-4a73-8359-26ddcc7fe684 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 unbound from our chassis#033[00m
Dec  3 09:20:56 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:56.993 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 08007569-27ca-4ce6-b140-d7ea7d6cd593#033[00m
Dec  3 09:20:56 np0005544118 nova_compute[187283]: 2025-12-03 14:20:56.995 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.012 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ef2318b7-311c-412c-9524-69a13f94da9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:57 np0005544118 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec  3 09:20:57 np0005544118 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 15.293s CPU time.
Dec  3 09:20:57 np0005544118 systemd-machined[153602]: Machine qemu-3-instance-00000003 terminated.
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.046 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[fb439b1d-fe38-4a41-a686-af2a32d86cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.049 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[d7839df0-bede-44e9-ba96-7f2f9af99b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.077 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[d2d5368a-ef38-4dfe-b77c-25d2ccd34375]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.093 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9c493e81-b1d9-4d15-a13f-cb46ce0ae985]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap08007569-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5a:c9:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 11, 'rx_bytes': 784, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372873, 'reachable_time': 42731, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 209885, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.109 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[58fca831-d7f3-443d-bec1-c278120848e9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372885, 'tstamp': 372885}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209886, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap08007569-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 372889, 'tstamp': 372889}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 209886, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.110 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.111 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.116 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.116 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap08007569-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.116 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.117 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap08007569-20, col_values=(('external_ids', {'iface-id': '445f8411-d8a5-4a44-8c5f-54cc69a35119'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:20:57.117 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.226 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.227 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.227 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.445 187287 DEBUG nova.virt.libvirt.guest [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '5a133b9a-7b3b-4026-bc0f-c3b4a7587999' (instance-00000003) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.446 187287 INFO nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Migration operation has completed#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.447 187287 INFO nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] _post_live_migration() is started..#033[00m
Dec  3 09:20:57 np0005544118 nova_compute[187283]: 2025-12-03 14:20:57.835 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:57 np0005544118 ovn_controller[95637]: 2025-12-03T14:20:57Z|00054|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.016 187287 DEBUG nova.compute.manager [req-85c50c87-4ebb-4b07-9c76-1e2b5f543506 req-5e7e4821-04c8-4fe1-b389-33aa84a455f7 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.016 187287 DEBUG oslo_concurrency.lockutils [req-85c50c87-4ebb-4b07-9c76-1e2b5f543506 req-5e7e4821-04c8-4fe1-b389-33aa84a455f7 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.016 187287 DEBUG oslo_concurrency.lockutils [req-85c50c87-4ebb-4b07-9c76-1e2b5f543506 req-5e7e4821-04c8-4fe1-b389-33aa84a455f7 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.017 187287 DEBUG oslo_concurrency.lockutils [req-85c50c87-4ebb-4b07-9c76-1e2b5f543506 req-5e7e4821-04c8-4fe1-b389-33aa84a455f7 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.017 187287 DEBUG nova.compute.manager [req-85c50c87-4ebb-4b07-9c76-1e2b5f543506 req-5e7e4821-04c8-4fe1-b389-33aa84a455f7 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.017 187287 DEBUG nova.compute.manager [req-85c50c87-4ebb-4b07-9c76-1e2b5f543506 req-5e7e4821-04c8-4fe1-b389-33aa84a455f7 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.514 187287 DEBUG nova.network.neutron [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Updated VIF entry in instance network info cache for port 1b6ad066-7586-4a73-8359-26ddcc7fe684. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.514 187287 DEBUG nova.network.neutron [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Updating instance_info_cache with network_info: [{"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.537 187287 DEBUG oslo_concurrency.lockutils [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-5a133b9a-7b3b-4026-bc0f-c3b4a7587999" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.538 187287 DEBUG nova.compute.manager [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-changed-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.538 187287 DEBUG nova.compute.manager [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Refreshing instance network info cache due to event network-changed-49af6e1a-ee13-46b0-b0b8-e99e240fb96d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.539 187287 DEBUG oslo_concurrency.lockutils [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.539 187287 DEBUG oslo_concurrency.lockutils [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.540 187287 DEBUG nova.network.neutron [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Refreshing network info cache for port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.559 187287 DEBUG nova.network.neutron [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Activated binding for port 1b6ad066-7586-4a73-8359-26ddcc7fe684 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.560 187287 DEBUG nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.562 187287 DEBUG nova.virt.libvirt.vif [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:19:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-631160386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-631160386',id=3,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:19:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-j011ek2n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:20:42Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=5a133b9a-7b3b-4026-bc0f-c3b4a7587999,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.562 187287 DEBUG nova.network.os_vif_util [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "address": "fa:16:3e:35:af:46", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1b6ad066-75", "ovs_interfaceid": "1b6ad066-7586-4a73-8359-26ddcc7fe684", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.564 187287 DEBUG nova.network.os_vif_util [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.564 187287 DEBUG os_vif [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.568 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.569 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b6ad066-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.572 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.575 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.577 187287 INFO os_vif [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:35:af:46,bridge_name='br-int',has_traffic_filtering=True,id=1b6ad066-7586-4a73-8359-26ddcc7fe684,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1b6ad066-75')#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.578 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.578 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.578 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.579 187287 DEBUG nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.579 187287 INFO nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Deleting instance files /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999_del#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.580 187287 INFO nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Deletion of /var/lib/nova/instances/5a133b9a-7b3b-4026-bc0f-c3b4a7587999_del complete#033[00m
Dec  3 09:20:58 np0005544118 nova_compute[187283]: 2025-12-03 14:20:58.880 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:20:59 np0005544118 nova_compute[187283]: 2025-12-03 14:20:59.610 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:20:59 np0005544118 nova_compute[187283]: 2025-12-03 14:20:59.611 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 09:20:59 np0005544118 podman[209905]: 2025-12-03 14:20:59.872004629 +0000 UTC m=+0.106556926 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.135 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.135 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.135 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.136 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.136 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.136 187287 WARNING nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received unexpected event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.136 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.136 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.137 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.137 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.137 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.137 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-unplugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.137 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.138 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.138 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.138 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.138 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.138 187287 WARNING nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received unexpected event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.139 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.139 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.139 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.139 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.139 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.140 187287 WARNING nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received unexpected event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.140 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.140 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.140 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.140 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.141 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] No waiting events found dispatching network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.141 187287 WARNING nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Received unexpected event network-vif-plugged-1b6ad066-7586-4a73-8359-26ddcc7fe684 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.141 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.141 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.141 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.142 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.142 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] No waiting events found dispatching network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.142 187287 WARNING nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received unexpected event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d for instance with vm_state resized and task_state None.#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.142 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.142 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.142 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.142 187287 DEBUG oslo_concurrency.lockutils [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.143 187287 DEBUG nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] No waiting events found dispatching network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.143 187287 WARNING nova.compute.manager [req-2a86ff63-750d-4f83-abbd-be02b16678ab req-e2d1054d-1fc9-47df-a408-fd942ffc4581 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Received unexpected event network-vif-plugged-49af6e1a-ee13-46b0-b0b8-e99e240fb96d for instance with vm_state resized and task_state None.#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.492 187287 DEBUG nova.network.neutron [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updated VIF entry in instance network info cache for port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.492 187287 DEBUG nova.network.neutron [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updating instance_info_cache with network_info: [{"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:21:00 np0005544118 nova_compute[187283]: 2025-12-03 14:21:00.511 187287 DEBUG oslo_concurrency.lockutils [req-a293d828-9f1f-4e70-a7a7-0f8658bac0ab req-343434aa-226a-40ed-b60a-70fe8fc29a37 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:21:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:00.952 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:00.952 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:00.953 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:00 np0005544118 systemd[1]: Stopping User Manager for UID 42436...
Dec  3 09:21:00 np0005544118 systemd[209752]: Activating special unit Exit the Session...
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped target Main User Target.
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped target Basic System.
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped target Paths.
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped target Sockets.
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped target Timers.
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  3 09:21:00 np0005544118 systemd[209752]: Closed D-Bus User Message Bus Socket.
Dec  3 09:21:00 np0005544118 systemd[209752]: Stopped Create User's Volatile Files and Directories.
Dec  3 09:21:01 np0005544118 systemd[209752]: Removed slice User Application Slice.
Dec  3 09:21:01 np0005544118 systemd[209752]: Reached target Shutdown.
Dec  3 09:21:01 np0005544118 systemd[209752]: Finished Exit the Session.
Dec  3 09:21:01 np0005544118 systemd[209752]: Reached target Exit the Session.
Dec  3 09:21:01 np0005544118 systemd[1]: user@42436.service: Deactivated successfully.
Dec  3 09:21:01 np0005544118 systemd[1]: Stopped User Manager for UID 42436.
Dec  3 09:21:01 np0005544118 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec  3 09:21:01 np0005544118 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec  3 09:21:01 np0005544118 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec  3 09:21:01 np0005544118 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec  3 09:21:01 np0005544118 systemd[1]: Removed slice User Slice of UID 42436.
Dec  3 09:21:02 np0005544118 nova_compute[187283]: 2025-12-03 14:21:02.798 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "630952de-d907-4370-aed3-0bec512896a1" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:02 np0005544118 nova_compute[187283]: 2025-12-03 14:21:02.799 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:02 np0005544118 nova_compute[187283]: 2025-12-03 14:21:02.799 187287 DEBUG nova.compute.manager [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Going to confirm migration 3 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Dec  3 09:21:03 np0005544118 nova_compute[187283]: 2025-12-03 14:21:03.573 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:03 np0005544118 nova_compute[187283]: 2025-12-03 14:21:03.881 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:03 np0005544118 nova_compute[187283]: 2025-12-03 14:21:03.939 187287 DEBUG neutronclient.v2_0.client [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 49af6e1a-ee13-46b0-b0b8-e99e240fb96d for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Dec  3 09:21:03 np0005544118 nova_compute[187283]: 2025-12-03 14:21:03.940 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:21:03 np0005544118 nova_compute[187283]: 2025-12-03 14:21:03.940 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:21:03 np0005544118 nova_compute[187283]: 2025-12-03 14:21:03.941 187287 DEBUG nova.network.neutron [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:21:03 np0005544118 nova_compute[187283]: 2025-12-03 14:21:03.941 187287 DEBUG nova.objects.instance [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'info_cache' on Instance uuid 630952de-d907-4370-aed3-0bec512896a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:21:04 np0005544118 nova_compute[187283]: 2025-12-03 14:21:04.970 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:04 np0005544118 nova_compute[187283]: 2025-12-03 14:21:04.971 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:04 np0005544118 nova_compute[187283]: 2025-12-03 14:21:04.971 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "5a133b9a-7b3b-4026-bc0f-c3b4a7587999-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:04 np0005544118 nova_compute[187283]: 2025-12-03 14:21:04.998 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:04 np0005544118 nova_compute[187283]: 2025-12-03 14:21:04.999 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:04.999 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.000 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.079 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.153 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.155 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.207 187287 DEBUG oslo_concurrency.processutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.214 187287 WARNING nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Periodic task is updating the host stats, it is trying to get disk info for instance-00000005, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/630952de-d907-4370-aed3-0bec512896a1/disk#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.383 187287 WARNING nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.385 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5694MB free_disk=73.28545761108398GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.385 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.386 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.437 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Migration for instance 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.438 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Migration for instance 630952de-d907-4370-aed3-0bec512896a1 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.470 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.495 187287 INFO nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updating resource usage from migration da91c5a6-2144-44a1-bf59-4b57c28c94b6#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.495 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Starting to track outgoing migration da91c5a6-2144-44a1-bf59-4b57c28c94b6 with flavor ec610f84-c649-49d7-9c7a-a22befc31fb8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.614 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Migration 764541e1-aa4e-4e28-8f4e-f18e536761c9 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.615 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Instance 7e534904-afa6-40ef-bb5a-ac4971f60d75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.615 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Migration da91c5a6-2144-44a1-bf59-4b57c28c94b6 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.616 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.616 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=832MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.621 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:05 np0005544118 podman[197639]: time="2025-12-03T14:21:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:21:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:21:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:21:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:21:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.732 187287 DEBUG nova.compute.provider_tree [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.750 187287 DEBUG nova.scheduler.client.report [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.769 187287 DEBUG nova.compute.resource_tracker [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.770 187287 DEBUG oslo_concurrency.lockutils [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.774 187287 INFO nova.compute.manager [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.855 187287 INFO nova.scheduler.client.report [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Deleted allocation for migration 764541e1-aa4e-4e28-8f4e-f18e536761c9#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.856 187287 DEBUG nova.virt.libvirt.driver [None req-71ef8057-89ea-413e-8fb7-7b3b2ddc6b9c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.934 187287 DEBUG nova.network.neutron [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 630952de-d907-4370-aed3-0bec512896a1] Updating instance_info_cache with network_info: [{"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.957 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-630952de-d907-4370-aed3-0bec512896a1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.957 187287 DEBUG nova.objects.instance [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 630952de-d907-4370-aed3-0bec512896a1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.980 187287 DEBUG nova.virt.libvirt.host [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.980 187287 INFO nova.virt.libvirt.host [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] UEFI support detected#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.982 187287 DEBUG nova.virt.libvirt.vif [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1704045789',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1704045789',id=5,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:20:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-ymg8wtcd',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:20:59Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=630952de-d907-4370-aed3-0bec512896a1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.983 187287 DEBUG nova.network.os_vif_util [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "address": "fa:16:3e:17:e7:b4", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49af6e1a-ee", "ovs_interfaceid": "49af6e1a-ee13-46b0-b0b8-e99e240fb96d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.984 187287 DEBUG nova.network.os_vif_util [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.984 187287 DEBUG os_vif [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.986 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.986 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49af6e1a-ee, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.987 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.990 187287 INFO os_vif [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:17:e7:b4,bridge_name='br-int',has_traffic_filtering=True,id=49af6e1a-ee13-46b0-b0b8-e99e240fb96d,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49af6e1a-ee')#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.990 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:05 np0005544118 nova_compute[187283]: 2025-12-03 14:21:05.990 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.053 187287 DEBUG nova.compute.provider_tree [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.067 187287 DEBUG nova.scheduler.client.report [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.105 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.204 187287 INFO nova.scheduler.client.report [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Deleted allocation for migration da91c5a6-2144-44a1-bf59-4b57c28c94b6#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.245 187287 DEBUG oslo_concurrency.lockutils [None req-1e2640fd-a895-489b-b5ef-b17b64eac19b b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "630952de-d907-4370-aed3-0bec512896a1" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.446s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.654 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.654 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.655 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.655 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:06 np0005544118 nova_compute[187283]: 2025-12-03 14:21:06.655 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:07 np0005544118 nova_compute[187283]: 2025-12-03 14:21:07.274 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764771652.2728593, 630952de-d907-4370-aed3-0bec512896a1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:21:07 np0005544118 nova_compute[187283]: 2025-12-03 14:21:07.275 187287 INFO nova.compute.manager [-] [instance: 630952de-d907-4370-aed3-0bec512896a1] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:21:07 np0005544118 nova_compute[187283]: 2025-12-03 14:21:07.305 187287 DEBUG nova.compute.manager [None req-5d94143e-8375-4696-9570-517833e78a0b - - - - - -] [instance: 630952de-d907-4370-aed3-0bec512896a1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:21:07 np0005544118 nova_compute[187283]: 2025-12-03 14:21:07.619 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:07 np0005544118 nova_compute[187283]: 2025-12-03 14:21:07.619 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:07 np0005544118 nova_compute[187283]: 2025-12-03 14:21:07.620 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:21:07 np0005544118 nova_compute[187283]: 2025-12-03 14:21:07.783 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:21:08 np0005544118 nova_compute[187283]: 2025-12-03 14:21:08.576 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:08 np0005544118 nova_compute[187283]: 2025-12-03 14:21:08.882 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.771 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.795 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.795 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.795 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.795 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:21:10 np0005544118 podman[209941]: 2025-12-03 14:21:10.84057891 +0000 UTC m=+0.072554952 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.867 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.923 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.924 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:21:10 np0005544118 nova_compute[187283]: 2025-12-03 14:21:10.980 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.133 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.134 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5683MB free_disk=73.31404876708984GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.135 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.135 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.200 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 7e534904-afa6-40ef-bb5a-ac4971f60d75 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.201 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.201 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.244 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.262 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.263 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:21:11 np0005544118 nova_compute[187283]: 2025-12-03 14:21:11.264 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.101 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.224 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764771657.2237055, 5a133b9a-7b3b-4026-bc0f-c3b4a7587999 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.225 187287 INFO nova.compute.manager [-] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.247 187287 DEBUG nova.compute.manager [None req-3b4f3f94-7503-4250-a7d1-2d53c328ba77 - - - - - -] [instance: 5a133b9a-7b3b-4026-bc0f-c3b4a7587999] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.603 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.637 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:12 np0005544118 nova_compute[187283]: 2025-12-03 14:21:12.637 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:21:13 np0005544118 nova_compute[187283]: 2025-12-03 14:21:13.579 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:13 np0005544118 nova_compute[187283]: 2025-12-03 14:21:13.884 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:14 np0005544118 podman[209968]: 2025-12-03 14:21:14.853837164 +0000 UTC m=+0.076775157 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec  3 09:21:16 np0005544118 nova_compute[187283]: 2025-12-03 14:21:16.751 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:21:16 np0005544118 nova_compute[187283]: 2025-12-03 14:21:16.770 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Triggering sync for uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  3 09:21:16 np0005544118 nova_compute[187283]: 2025-12-03 14:21:16.770 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:16 np0005544118 nova_compute[187283]: 2025-12-03 14:21:16.771 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:16 np0005544118 nova_compute[187283]: 2025-12-03 14:21:16.816 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:18 np0005544118 nova_compute[187283]: 2025-12-03 14:21:18.582 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:18 np0005544118 nova_compute[187283]: 2025-12-03 14:21:18.886 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:21:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:21:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:21:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:21:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:21:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:21:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.135 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.136 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.136 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.136 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.137 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.138 187287 INFO nova.compute.manager [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Terminating instance#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.139 187287 DEBUG nova.compute.manager [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:21:22 np0005544118 kernel: tap27279299-81 (unregistering): left promiscuous mode
Dec  3 09:21:22 np0005544118 NetworkManager[55710]: <info>  [1764771682.1787] device (tap27279299-81): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00055|binding|INFO|Releasing lport 27279299-81d4-46c8-a65e-40a61fe9ef64 from this chassis (sb_readonly=0)
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.229 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00056|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 down in Southbound
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00057|binding|INFO|Removing iface tap27279299-81 ovn-installed in OVS
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.233 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.241 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:54:b8 10.100.0.3'], port_security=['fa:16:3e:db:54:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7e534904-afa6-40ef-bb5a-ac4971f60d75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=27279299-81d4-46c8-a65e-40a61fe9ef64) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.242 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 27279299-81d4-46c8-a65e-40a61fe9ef64 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 unbound from our chassis#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.243 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08007569-27ca-4ce6-b140-d7ea7d6cd593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.244 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[faa70512-5410-4342-bda6-de928d8d1e38]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.244 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.245 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 namespace which is not needed anymore#033[00m
Dec  3 09:21:22 np0005544118 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec  3 09:21:22 np0005544118 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000001.scope: Consumed 17.229s CPU time.
Dec  3 09:21:22 np0005544118 systemd-machined[153602]: Machine qemu-2-instance-00000001 terminated.
Dec  3 09:21:22 np0005544118 kernel: tap27279299-81: entered promiscuous mode
Dec  3 09:21:22 np0005544118 NetworkManager[55710]: <info>  [1764771682.3581] manager: (tap27279299-81): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Dec  3 09:21:22 np0005544118 systemd-udevd[209991]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00058|binding|INFO|Claiming lport 27279299-81d4-46c8-a65e-40a61fe9ef64 for this chassis.
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00059|binding|INFO|27279299-81d4-46c8-a65e-40a61fe9ef64: Claiming fa:16:3e:db:54:b8 10.100.0.3
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.360 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 kernel: tap27279299-81 (unregistering): left promiscuous mode
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.376 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:54:b8 10.100.0.3'], port_security=['fa:16:3e:db:54:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7e534904-afa6-40ef-bb5a-ac4971f60d75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=27279299-81d4-46c8-a65e-40a61fe9ef64) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:21:22 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [NOTICE]   (209274) : haproxy version is 2.8.14-c23fe91
Dec  3 09:21:22 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [NOTICE]   (209274) : path to executable is /usr/sbin/haproxy
Dec  3 09:21:22 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [WARNING]  (209274) : Exiting Master process...
Dec  3 09:21:22 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [WARNING]  (209274) : Exiting Master process...
Dec  3 09:21:22 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [ALERT]    (209274) : Current worker (209276) exited with code 143 (Terminated)
Dec  3 09:21:22 np0005544118 neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593[209270]: [WARNING]  (209274) : All workers exited. Exiting... (0)
Dec  3 09:21:22 np0005544118 systemd[1]: libpod-df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83.scope: Deactivated successfully.
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00060|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 ovn-installed in OVS
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00061|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 up in Southbound
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.391 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00062|binding|INFO|Releasing lport 27279299-81d4-46c8-a65e-40a61fe9ef64 from this chassis (sb_readonly=1)
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00063|binding|INFO|Removing iface tap27279299-81 ovn-installed in OVS
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00064|if_status|INFO|Not setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 down as sb is readonly
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.394 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 podman[210012]: 2025-12-03 14:21:22.395662736 +0000 UTC m=+0.059979331 container died df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00065|binding|INFO|Releasing lport 27279299-81d4-46c8-a65e-40a61fe9ef64 from this chassis (sb_readonly=0)
Dec  3 09:21:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:21:22Z|00066|binding|INFO|Setting lport 27279299-81d4-46c8-a65e-40a61fe9ef64 down in Southbound
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.402 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:54:b8 10.100.0.3'], port_security=['fa:16:3e:db:54:b8 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7e534904-afa6-40ef-bb5a-ac4971f60d75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a16e69f7b8f43529e0c039245ec148d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '61818a98-fa19-49f3-b664-319194b04df2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88e93f24-79bc-400f-aed0-3ebe0ba49758, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=27279299-81d4-46c8-a65e-40a61fe9ef64) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.410 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.421 187287 INFO nova.virt.libvirt.driver [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Instance destroyed successfully.#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.422 187287 DEBUG nova.objects.instance [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lazy-loading 'resources' on Instance uuid 7e534904-afa6-40ef-bb5a-ac4971f60d75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:21:22 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83-userdata-shm.mount: Deactivated successfully.
Dec  3 09:21:22 np0005544118 systemd[1]: var-lib-containers-storage-overlay-2a004e7f20ebedf3c7598d71aadb249e524a597540892f7c6aaad7b9ca7da471-merged.mount: Deactivated successfully.
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.436 187287 DEBUG nova.virt.libvirt.vif [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:18:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-479405706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-479405706',id=1,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:19:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8a16e69f7b8f43529e0c039245ec148d',ramdisk_id='',reservation_id='r-gel2j9x0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',o
wner_project_name='tempest-TestExecuteActionsViaActuator-1110081854',owner_user_name='tempest-TestExecuteActionsViaActuator-1110081854-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:19:38Z,user_data=None,user_id='a4376db45a5649bbab6eb86fb45a0248',uuid=7e534904-afa6-40ef-bb5a-ac4971f60d75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.437 187287 DEBUG nova.network.os_vif_util [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converting VIF {"id": "27279299-81d4-46c8-a65e-40a61fe9ef64", "address": "fa:16:3e:db:54:b8", "network": {"id": "08007569-27ca-4ce6-b140-d7ea7d6cd593", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-2131696460-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a16e69f7b8f43529e0c039245ec148d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27279299-81", "ovs_interfaceid": "27279299-81d4-46c8-a65e-40a61fe9ef64", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:21:22 np0005544118 podman[210012]: 2025-12-03 14:21:22.438012066 +0000 UTC m=+0.102328671 container cleanup df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.438 187287 DEBUG nova.network.os_vif_util [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.438 187287 DEBUG os_vif [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.441 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.441 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27279299-81, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.445 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.446 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:21:22 np0005544118 systemd[1]: libpod-conmon-df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83.scope: Deactivated successfully.
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.448 187287 INFO os_vif [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:db:54:b8,bridge_name='br-int',has_traffic_filtering=True,id=27279299-81d4-46c8-a65e-40a61fe9ef64,network=Network(08007569-27ca-4ce6-b140-d7ea7d6cd593),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27279299-81')#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.450 187287 INFO nova.virt.libvirt.driver [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Deleting instance files /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_del#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.451 187287 INFO nova.virt.libvirt.driver [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Deletion of /var/lib/nova/instances/7e534904-afa6-40ef-bb5a-ac4971f60d75_del complete#033[00m
Dec  3 09:21:22 np0005544118 podman[210051]: 2025-12-03 14:21:22.505084038 +0000 UTC m=+0.048306673 container remove df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.508 187287 INFO nova.compute.manager [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.509 187287 DEBUG oslo.service.loopingcall [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.510 187287 DEBUG nova.compute.manager [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.510 187287 DEBUG nova.network.neutron [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.514 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[25a2133b-06f7-49b1-a4f7-e88013dfdca3]: (4, ('Wed Dec  3 02:21:22 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 (df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83)\ndf7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83\nWed Dec  3 02:21:22 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 (df7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83)\ndf7040cf52336dc017dbf9211cc0ab0d5cd1c493c468bcd599dcc9e704506e83\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.515 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2cee4bdd-2fbf-4099-a4db-bee8ee515e8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.516 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap08007569-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.518 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 kernel: tap08007569-20: left promiscuous mode
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.541 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.544 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[61033f36-9025-4630-8f20-69ce3af6009c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.567 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b2f2267c-a8f4-490f-a71c-f1f3b0eea065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.568 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9436588f-02bb-4d60-a92d-82e36cf6d66a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.581 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7d2c4685-b878-4540-9d8d-3d5d4001cba0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 372866, 'reachable_time': 28548, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210066, 'error': None, 'target': 'ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.584 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-08007569-27ca-4ce6-b140-d7ea7d6cd593 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.585 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[2faba16d-5333-4e30-8614-ce73bdbdc0eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 systemd[1]: run-netns-ovnmeta\x2d08007569\x2d27ca\x2d4ce6\x2db140\x2dd7ea7d6cd593.mount: Deactivated successfully.
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.586 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 27279299-81d4-46c8-a65e-40a61fe9ef64 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 unbound from our chassis#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.588 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08007569-27ca-4ce6-b140-d7ea7d6cd593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.588 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[221eb199-b784-463c-868f-b0de93d38163]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.589 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 27279299-81d4-46c8-a65e-40a61fe9ef64 in datapath 08007569-27ca-4ce6-b140-d7ea7d6cd593 unbound from our chassis#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.590 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 08007569-27ca-4ce6-b140-d7ea7d6cd593, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:21:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:22.590 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a8c07b-69dd-4548-80e9-d67375e9b1d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.976 187287 DEBUG nova.compute.manager [req-ea5fb8f8-57e4-453f-b5c2-5975693d673a req-bf43bd92-d7d7-4662-83b2-b0f0a393e374 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-unplugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.977 187287 DEBUG oslo_concurrency.lockutils [req-ea5fb8f8-57e4-453f-b5c2-5975693d673a req-bf43bd92-d7d7-4662-83b2-b0f0a393e374 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.977 187287 DEBUG oslo_concurrency.lockutils [req-ea5fb8f8-57e4-453f-b5c2-5975693d673a req-bf43bd92-d7d7-4662-83b2-b0f0a393e374 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.978 187287 DEBUG oslo_concurrency.lockutils [req-ea5fb8f8-57e4-453f-b5c2-5975693d673a req-bf43bd92-d7d7-4662-83b2-b0f0a393e374 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.978 187287 DEBUG nova.compute.manager [req-ea5fb8f8-57e4-453f-b5c2-5975693d673a req-bf43bd92-d7d7-4662-83b2-b0f0a393e374 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] No waiting events found dispatching network-vif-unplugged-27279299-81d4-46c8-a65e-40a61fe9ef64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:22 np0005544118 nova_compute[187283]: 2025-12-03 14:21:22.979 187287 DEBUG nova.compute.manager [req-ea5fb8f8-57e4-453f-b5c2-5975693d673a req-bf43bd92-d7d7-4662-83b2-b0f0a393e374 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-unplugged-27279299-81d4-46c8-a65e-40a61fe9ef64 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.392 187287 DEBUG nova.network.neutron [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.422 187287 INFO nova.compute.manager [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Took 0.91 seconds to deallocate network for instance.#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.465 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.465 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.517 187287 DEBUG nova.compute.provider_tree [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.535 187287 DEBUG nova.scheduler.client.report [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.554 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.574 187287 INFO nova.scheduler.client.report [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Deleted allocations for instance 7e534904-afa6-40ef-bb5a-ac4971f60d75#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.632 187287 DEBUG oslo_concurrency.lockutils [None req-4c2ecc06-9940-4eda-8287-31caccac4977 a4376db45a5649bbab6eb86fb45a0248 8a16e69f7b8f43529e0c039245ec148d - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.496s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:23 np0005544118 nova_compute[187283]: 2025-12-03 14:21:23.887 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:24 np0005544118 podman[210067]: 2025-12-03 14:21:24.858476293 +0000 UTC m=+0.081213798 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:21:25 np0005544118 nova_compute[187283]: 2025-12-03 14:21:25.050 187287 DEBUG nova.compute.manager [req-3a5c8fbc-e7ba-4f22-8ee0-306abfb89820 req-e3a62d15-9e49-49f4-8271-2f8f8f5461e5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:25 np0005544118 nova_compute[187283]: 2025-12-03 14:21:25.051 187287 DEBUG oslo_concurrency.lockutils [req-3a5c8fbc-e7ba-4f22-8ee0-306abfb89820 req-e3a62d15-9e49-49f4-8271-2f8f8f5461e5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:21:25 np0005544118 nova_compute[187283]: 2025-12-03 14:21:25.051 187287 DEBUG oslo_concurrency.lockutils [req-3a5c8fbc-e7ba-4f22-8ee0-306abfb89820 req-e3a62d15-9e49-49f4-8271-2f8f8f5461e5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:21:25 np0005544118 nova_compute[187283]: 2025-12-03 14:21:25.051 187287 DEBUG oslo_concurrency.lockutils [req-3a5c8fbc-e7ba-4f22-8ee0-306abfb89820 req-e3a62d15-9e49-49f4-8271-2f8f8f5461e5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "7e534904-afa6-40ef-bb5a-ac4971f60d75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:21:25 np0005544118 nova_compute[187283]: 2025-12-03 14:21:25.052 187287 DEBUG nova.compute.manager [req-3a5c8fbc-e7ba-4f22-8ee0-306abfb89820 req-e3a62d15-9e49-49f4-8271-2f8f8f5461e5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] No waiting events found dispatching network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:21:25 np0005544118 nova_compute[187283]: 2025-12-03 14:21:25.052 187287 WARNING nova.compute.manager [req-3a5c8fbc-e7ba-4f22-8ee0-306abfb89820 req-e3a62d15-9e49-49f4-8271-2f8f8f5461e5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received unexpected event network-vif-plugged-27279299-81d4-46c8-a65e-40a61fe9ef64 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:21:25 np0005544118 nova_compute[187283]: 2025-12-03 14:21:25.052 187287 DEBUG nova.compute.manager [req-3a5c8fbc-e7ba-4f22-8ee0-306abfb89820 req-e3a62d15-9e49-49f4-8271-2f8f8f5461e5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Received event network-vif-deleted-27279299-81d4-46c8-a65e-40a61fe9ef64 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:21:26 np0005544118 podman[210088]: 2025-12-03 14:21:26.8253589 +0000 UTC m=+0.055058436 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:21:27 np0005544118 nova_compute[187283]: 2025-12-03 14:21:27.483 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:27 np0005544118 nova_compute[187283]: 2025-12-03 14:21:27.542 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:28 np0005544118 nova_compute[187283]: 2025-12-03 14:21:28.888 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:30 np0005544118 podman[210112]: 2025-12-03 14:21:30.888464267 +0000 UTC m=+0.106813533 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:21:32 np0005544118 nova_compute[187283]: 2025-12-03 14:21:32.487 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:33 np0005544118 nova_compute[187283]: 2025-12-03 14:21:33.892 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:35 np0005544118 podman[197639]: time="2025-12-03T14:21:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:21:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:21:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:21:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:21:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2585 "" "Go-http-client/1.1"
Dec  3 09:21:37 np0005544118 nova_compute[187283]: 2025-12-03 14:21:37.419 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764771682.4186225, 7e534904-afa6-40ef-bb5a-ac4971f60d75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:21:37 np0005544118 nova_compute[187283]: 2025-12-03 14:21:37.420 187287 INFO nova.compute.manager [-] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:21:37 np0005544118 nova_compute[187283]: 2025-12-03 14:21:37.445 187287 DEBUG nova.compute.manager [None req-f9aa7fec-1a1e-42ee-ac61-edab0db3a96e - - - - - -] [instance: 7e534904-afa6-40ef-bb5a-ac4971f60d75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:21:37 np0005544118 nova_compute[187283]: 2025-12-03 14:21:37.492 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:38 np0005544118 nova_compute[187283]: 2025-12-03 14:21:38.894 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:41 np0005544118 podman[210140]: 2025-12-03 14:21:41.825834802 +0000 UTC m=+0.057866173 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Dec  3 09:21:42 np0005544118 nova_compute[187283]: 2025-12-03 14:21:42.494 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:42.943 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:21:42 np0005544118 nova_compute[187283]: 2025-12-03 14:21:42.943 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:42.944 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:21:43 np0005544118 nova_compute[187283]: 2025-12-03 14:21:43.897 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:45 np0005544118 podman[210163]: 2025-12-03 14:21:45.841660365 +0000 UTC m=+0.073547448 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:21:47 np0005544118 nova_compute[187283]: 2025-12-03 14:21:47.497 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:48 np0005544118 nova_compute[187283]: 2025-12-03 14:21:48.899 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:21:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:21:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:21:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:21:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:21:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:21:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:21:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:21:52 np0005544118 nova_compute[187283]: 2025-12-03 14:21:52.539 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:52 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:21:52.945 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:21:53 np0005544118 nova_compute[187283]: 2025-12-03 14:21:53.900 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:55 np0005544118 podman[210184]: 2025-12-03 14:21:55.865381504 +0000 UTC m=+0.098868778 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  3 09:21:57 np0005544118 nova_compute[187283]: 2025-12-03 14:21:57.541 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:21:57 np0005544118 podman[210203]: 2025-12-03 14:21:57.829115405 +0000 UTC m=+0.061704317 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:21:58 np0005544118 nova_compute[187283]: 2025-12-03 14:21:58.902 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:00 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:00Z|00067|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  3 09:22:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:00.953 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:00.954 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:00.954 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:01 np0005544118 podman[210227]: 2025-12-03 14:22:01.876315261 +0000 UTC m=+0.105613031 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.047 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.048 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.066 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.157 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.158 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.170 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.171 187287 INFO nova.compute.claims [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.264 187287 DEBUG nova.compute.provider_tree [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.279 187287 DEBUG nova.scheduler.client.report [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.298 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.299 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.344 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.344 187287 DEBUG nova.network.neutron [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.362 187287 INFO nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.380 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.469 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.470 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.470 187287 INFO nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Creating image(s)#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.471 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "/var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.471 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "/var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.472 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "/var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.489 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.542 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.555 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.555 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.556 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.566 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.619 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.620 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.654 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.655 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.656 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.689 187287 DEBUG nova.policy [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e5685807f9ad4ce3bc2025cc88a7ce46', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e8b2813b7af41f980b694f72644be72', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.706 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.707 187287 DEBUG nova.virt.disk.api [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Checking if we can resize image /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.707 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.758 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.759 187287 DEBUG nova.virt.disk.api [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Cannot resize image /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.759 187287 DEBUG nova.objects.instance [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lazy-loading 'migration_context' on Instance uuid 169bfb78-29ea-4873-be18-f12232b1ee89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.775 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.775 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Ensure instance console log exists: /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.776 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.776 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:02 np0005544118 nova_compute[187283]: 2025-12-03 14:22:02.776 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.234 187287 DEBUG nova.network.neutron [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Successfully created port: 7a953924-4659-41c3-8e24-4d900f93e547 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.905 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.912 187287 DEBUG nova.network.neutron [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Successfully updated port: 7a953924-4659-41c3-8e24-4d900f93e547 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.925 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.926 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquired lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.926 187287 DEBUG nova.network.neutron [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.992 187287 DEBUG nova.compute.manager [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received event network-changed-7a953924-4659-41c3-8e24-4d900f93e547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.992 187287 DEBUG nova.compute.manager [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Refreshing instance network info cache due to event network-changed-7a953924-4659-41c3-8e24-4d900f93e547. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:22:03 np0005544118 nova_compute[187283]: 2025-12-03 14:22:03.993 187287 DEBUG oslo_concurrency.lockutils [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.071 187287 DEBUG nova.network.neutron [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.631 187287 DEBUG nova.network.neutron [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Updating instance_info_cache with network_info: [{"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.654 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Releasing lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.654 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Instance network_info: |[{"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.655 187287 DEBUG oslo_concurrency.lockutils [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.656 187287 DEBUG nova.network.neutron [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Refreshing network info cache for port 7a953924-4659-41c3-8e24-4d900f93e547 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.662 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Start _get_guest_xml network_info=[{"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.668 187287 WARNING nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.674 187287 DEBUG nova.virt.libvirt.host [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.675 187287 DEBUG nova.virt.libvirt.host [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.677 187287 DEBUG nova.virt.libvirt.host [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.677 187287 DEBUG nova.virt.libvirt.host [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.678 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.679 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.679 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.679 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.679 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.680 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.680 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.680 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.680 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.680 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.681 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.681 187287 DEBUG nova.virt.hardware [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.684 187287 DEBUG nova.virt.libvirt.vif [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-885334645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-885334645',id=7,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e8b2813b7af41f980b694f72644be72',ramdisk_id='',reservation_id='r-gfpncl59',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2098095069',owner_user_name='tempest-TestExecuteBasicStrategy-2098095069-project-member'},tags=
TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:22:02Z,user_data=None,user_id='e5685807f9ad4ce3bc2025cc88a7ce46',uuid=169bfb78-29ea-4873-be18-f12232b1ee89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.685 187287 DEBUG nova.network.os_vif_util [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converting VIF {"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.685 187287 DEBUG nova.network.os_vif_util [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:39,bridge_name='br-int',has_traffic_filtering=True,id=7a953924-4659-41c3-8e24-4d900f93e547,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a953924-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.686 187287 DEBUG nova.objects.instance [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lazy-loading 'pci_devices' on Instance uuid 169bfb78-29ea-4873-be18-f12232b1ee89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.698 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <uuid>169bfb78-29ea-4873-be18-f12232b1ee89</uuid>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <name>instance-00000007</name>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteBasicStrategy-server-885334645</nova:name>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:22:04</nova:creationTime>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:user uuid="e5685807f9ad4ce3bc2025cc88a7ce46">tempest-TestExecuteBasicStrategy-2098095069-project-member</nova:user>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:project uuid="6e8b2813b7af41f980b694f72644be72">tempest-TestExecuteBasicStrategy-2098095069</nova:project>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        <nova:port uuid="7a953924-4659-41c3-8e24-4d900f93e547">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <entry name="serial">169bfb78-29ea-4873-be18-f12232b1ee89</entry>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <entry name="uuid">169bfb78-29ea-4873-be18-f12232b1ee89</entry>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk.config"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:7d:94:39"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <target dev="tap7a953924-46"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/console.log" append="off"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:22:04 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:22:04 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:22:04 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:22:04 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.699 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Preparing to wait for external event network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.700 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.700 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.700 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.701 187287 DEBUG nova.virt.libvirt.vif [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-885334645',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-885334645',id=7,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e8b2813b7af41f980b694f72644be72',ramdisk_id='',reservation_id='r-gfpncl59',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-2098095069',owner_user_name='tempest-TestExecuteBasicStrategy-2098095069-project-memb
er'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:22:02Z,user_data=None,user_id='e5685807f9ad4ce3bc2025cc88a7ce46',uuid=169bfb78-29ea-4873-be18-f12232b1ee89,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.702 187287 DEBUG nova.network.os_vif_util [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converting VIF {"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.702 187287 DEBUG nova.network.os_vif_util [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:39,bridge_name='br-int',has_traffic_filtering=True,id=7a953924-4659-41c3-8e24-4d900f93e547,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a953924-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.703 187287 DEBUG os_vif [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:39,bridge_name='br-int',has_traffic_filtering=True,id=7a953924-4659-41c3-8e24-4d900f93e547,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a953924-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.703 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.704 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.704 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.707 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.708 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a953924-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.708 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7a953924-46, col_values=(('external_ids', {'iface-id': '7a953924-4659-41c3-8e24-4d900f93e547', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7d:94:39', 'vm-uuid': '169bfb78-29ea-4873-be18-f12232b1ee89'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.710 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:04 np0005544118 NetworkManager[55710]: <info>  [1764771724.7119] manager: (tap7a953924-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.712 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.716 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.718 187287 INFO os_vif [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7d:94:39,bridge_name='br-int',has_traffic_filtering=True,id=7a953924-4659-41c3-8e24-4d900f93e547,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a953924-46')#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.771 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.772 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.772 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] No VIF found with MAC fa:16:3e:7d:94:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:22:04 np0005544118 nova_compute[187283]: 2025-12-03 14:22:04.773 187287 INFO nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Using config drive#033[00m
Dec  3 09:22:05 np0005544118 nova_compute[187283]: 2025-12-03 14:22:05.627 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:05 np0005544118 podman[197639]: time="2025-12-03T14:22:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:22:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:22:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:22:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:22:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.422 187287 INFO nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Creating config drive at /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk.config#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.434 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4rs920oz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.579 187287 DEBUG oslo_concurrency.processutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4rs920oz" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.631 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.632 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:22:06 np0005544118 kernel: tap7a953924-46: entered promiscuous mode
Dec  3 09:22:06 np0005544118 NetworkManager[55710]: <info>  [1764771726.6780] manager: (tap7a953924-46): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Dec  3 09:22:06 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:06Z|00068|binding|INFO|Claiming lport 7a953924-4659-41c3-8e24-4d900f93e547 for this chassis.
Dec  3 09:22:06 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:06Z|00069|binding|INFO|7a953924-4659-41c3-8e24-4d900f93e547: Claiming fa:16:3e:7d:94:39 10.100.0.12
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.679 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.685 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.692 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.703 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:94:39 10.100.0.12'], port_security=['fa:16:3e:7d:94:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '169bfb78-29ea-4873-be18-f12232b1ee89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b658e100-6efa-4402-8cec-ff46a9090590', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e8b2813b7af41f980b694f72644be72', 'neutron:revision_number': '2', 'neutron:security_group_ids': '11bbe6bc-d093-4c62-ae8a-84d38be042b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1f3a8d9-b8f1-4b67-8021-a6ae7fa21fbd, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=7a953924-4659-41c3-8e24-4d900f93e547) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.707 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 7a953924-4659-41c3-8e24-4d900f93e547 in datapath b658e100-6efa-4402-8cec-ff46a9090590 bound to our chassis#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.710 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b658e100-6efa-4402-8cec-ff46a9090590#033[00m
Dec  3 09:22:06 np0005544118 systemd-machined[153602]: New machine qemu-5-instance-00000007.
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.732 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8eb0eb44-117b-47a9-ac57-00274a096c3d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.734 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb658e100-61 in ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.738 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb658e100-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.738 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[19cb4743-b386-45b6-a225-2e69e348e014]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.739 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[89408fd3-163a-44e2-b5cb-19fb6356d622]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.752 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[79341500-0fb3-456b-8c09-2552354e0178]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 systemd[1]: Started Virtual Machine qemu-5-instance-00000007.
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.762 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:06 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:06Z|00070|binding|INFO|Setting lport 7a953924-4659-41c3-8e24-4d900f93e547 ovn-installed in OVS
Dec  3 09:22:06 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:06Z|00071|binding|INFO|Setting lport 7a953924-4659-41c3-8e24-4d900f93e547 up in Southbound
Dec  3 09:22:06 np0005544118 nova_compute[187283]: 2025-12-03 14:22:06.765 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:06 np0005544118 systemd-udevd[210292]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.779 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[87500dcc-7b99-419a-90f5-44c4b2dc14ce]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 NetworkManager[55710]: <info>  [1764771726.7927] device (tap7a953924-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:22:06 np0005544118 NetworkManager[55710]: <info>  [1764771726.7949] device (tap7a953924-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.808 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[3aed36b7-5939-4136-b38d-3b9df7e554ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.814 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3c1c17-17f0-4a01-bd79-f4b8b9147035]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 NetworkManager[55710]: <info>  [1764771726.8161] manager: (tapb658e100-60): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.842 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[22a01742-059d-4c96-8c1e-e7bb33b2a61a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.846 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[77f13cee-8f28-4c6d-a261-0b8146a3b359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 NetworkManager[55710]: <info>  [1764771726.8667] device (tapb658e100-60): carrier: link connected
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.873 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[3ab75f74-2323-41cc-b3e0-990c4e907381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.890 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[66c9e9ce-d4f2-4a75-9496-17db73121678]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb658e100-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:73:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388247, 'reachable_time': 19858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210322, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.902 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[692213a0-48e2-4845-8ba0-07da73ea2cd6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef4:73c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388247, 'tstamp': 388247}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210323, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.920 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ec3704-4281-4c48-9f4d-dd3c0fc0a15b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb658e100-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:73:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388247, 'reachable_time': 19858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 210324, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:06 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:06.953 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0ca9894a-6d41-47c0-bd8e-5b86b86dad44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.021 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cb7fae37-b611-4728-9b2e-da71125a1273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.022 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb658e100-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.022 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.023 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb658e100-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.025 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:07 np0005544118 NetworkManager[55710]: <info>  [1764771727.0258] manager: (tapb658e100-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec  3 09:22:07 np0005544118 kernel: tapb658e100-60: entered promiscuous mode
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.028 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.029 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb658e100-60, col_values=(('external_ids', {'iface-id': '4d87a9de-44de-4e18-91fc-35b66b76f3c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.030 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:07 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:07Z|00072|binding|INFO|Releasing lport 4d87a9de-44de-4e18-91fc-35b66b76f3c7 from this chassis (sb_readonly=0)
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.031 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.032 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b658e100-6efa-4402-8cec-ff46a9090590.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b658e100-6efa-4402-8cec-ff46a9090590.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.033 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d11d0c5d-ddcb-4a3f-a703-a56e648fc0d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.034 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-b658e100-6efa-4402-8cec-ff46a9090590
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/b658e100-6efa-4402-8cec-ff46a9090590.pid.haproxy
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID b658e100-6efa-4402-8cec-ff46a9090590
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:22:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:22:07.036 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'env', 'PROCESS_TAG=haproxy-b658e100-6efa-4402-8cec-ff46a9090590', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b658e100-6efa-4402-8cec-ff46a9090590.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.042 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.220 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771727.2202504, 169bfb78-29ea-4873-be18-f12232b1ee89 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.222 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] VM Started (Lifecycle Event)#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.248 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.254 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771727.2203622, 169bfb78-29ea-4873-be18-f12232b1ee89 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.255 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.297 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.304 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.323 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:22:07 np0005544118 podman[210363]: 2025-12-03 14:22:07.494166329 +0000 UTC m=+0.067438533 container create ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec  3 09:22:07 np0005544118 systemd[1]: Started libpod-conmon-ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3.scope.
Dec  3 09:22:07 np0005544118 podman[210363]: 2025-12-03 14:22:07.457959515 +0000 UTC m=+0.031231749 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:22:07 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:22:07 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e781b3e9f04ecf3cdef3ad97552be67021d7729e8ff0d05b14fee43b27663f81/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:22:07 np0005544118 podman[210363]: 2025-12-03 14:22:07.58918598 +0000 UTC m=+0.162458264 container init ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:22:07 np0005544118 podman[210363]: 2025-12-03 14:22:07.596018996 +0000 UTC m=+0.169291230 container start ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:07 np0005544118 neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590[210378]: [NOTICE]   (210382) : New worker (210384) forked
Dec  3 09:22:07 np0005544118 neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590[210378]: [NOTICE]   (210382) : Loading success.
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.703 187287 DEBUG nova.network.neutron [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Updated VIF entry in instance network info cache for port 7a953924-4659-41c3-8e24-4d900f93e547. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.704 187287 DEBUG nova.network.neutron [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Updating instance_info_cache with network_info: [{"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.726 187287 DEBUG oslo_concurrency.lockutils [req-ade6fa7e-f7e6-4f54-941c-5f2710c418fe req-7e40c940-e5ea-4fe4-b6ee-316a2178df0a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.859 187287 DEBUG nova.compute.manager [req-d3009fd5-dd29-4367-be42-0bbdd420db73 req-4f80187b-00d7-40fb-9304-ffa9910cf704 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received event network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.860 187287 DEBUG oslo_concurrency.lockutils [req-d3009fd5-dd29-4367-be42-0bbdd420db73 req-4f80187b-00d7-40fb-9304-ffa9910cf704 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.860 187287 DEBUG oslo_concurrency.lockutils [req-d3009fd5-dd29-4367-be42-0bbdd420db73 req-4f80187b-00d7-40fb-9304-ffa9910cf704 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.861 187287 DEBUG oslo_concurrency.lockutils [req-d3009fd5-dd29-4367-be42-0bbdd420db73 req-4f80187b-00d7-40fb-9304-ffa9910cf704 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.861 187287 DEBUG nova.compute.manager [req-d3009fd5-dd29-4367-be42-0bbdd420db73 req-4f80187b-00d7-40fb-9304-ffa9910cf704 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Processing event network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.862 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.865 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771727.865578, 169bfb78-29ea-4873-be18-f12232b1ee89 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.865 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.867 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.870 187287 INFO nova.virt.libvirt.driver [-] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Instance spawned successfully.#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.870 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.884 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.889 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.889 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.890 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.890 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.891 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.891 187287 DEBUG nova.virt.libvirt.driver [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.894 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.924 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.955 187287 INFO nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Took 5.49 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:22:07 np0005544118 nova_compute[187283]: 2025-12-03 14:22:07.955 187287 DEBUG nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:22:08 np0005544118 nova_compute[187283]: 2025-12-03 14:22:08.015 187287 INFO nova.compute.manager [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Took 5.90 seconds to build instance.#033[00m
Dec  3 09:22:08 np0005544118 nova_compute[187283]: 2025-12-03 14:22:08.030 187287 DEBUG oslo_concurrency.lockutils [None req-f9fcbd24-8bc2-44b6-81ba-9a73a9164a44 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:08 np0005544118 nova_compute[187283]: 2025-12-03 14:22:08.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:08 np0005544118 nova_compute[187283]: 2025-12-03 14:22:08.610 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:08 np0005544118 nova_compute[187283]: 2025-12-03 14:22:08.909 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:09 np0005544118 nova_compute[187283]: 2025-12-03 14:22:09.711 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:09 np0005544118 nova_compute[187283]: 2025-12-03 14:22:09.942 187287 DEBUG nova.compute.manager [req-e3189f7c-ccfa-46af-b6ef-a5bcc8e3c3a7 req-e86e2c69-82c8-4b4a-bec1-dbe3b961b304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received event network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:22:09 np0005544118 nova_compute[187283]: 2025-12-03 14:22:09.942 187287 DEBUG oslo_concurrency.lockutils [req-e3189f7c-ccfa-46af-b6ef-a5bcc8e3c3a7 req-e86e2c69-82c8-4b4a-bec1-dbe3b961b304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:09 np0005544118 nova_compute[187283]: 2025-12-03 14:22:09.942 187287 DEBUG oslo_concurrency.lockutils [req-e3189f7c-ccfa-46af-b6ef-a5bcc8e3c3a7 req-e86e2c69-82c8-4b4a-bec1-dbe3b961b304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:09 np0005544118 nova_compute[187283]: 2025-12-03 14:22:09.943 187287 DEBUG oslo_concurrency.lockutils [req-e3189f7c-ccfa-46af-b6ef-a5bcc8e3c3a7 req-e86e2c69-82c8-4b4a-bec1-dbe3b961b304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:09 np0005544118 nova_compute[187283]: 2025-12-03 14:22:09.943 187287 DEBUG nova.compute.manager [req-e3189f7c-ccfa-46af-b6ef-a5bcc8e3c3a7 req-e86e2c69-82c8-4b4a-bec1-dbe3b961b304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] No waiting events found dispatching network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:22:09 np0005544118 nova_compute[187283]: 2025-12-03 14:22:09.943 187287 WARNING nova.compute.manager [req-e3189f7c-ccfa-46af-b6ef-a5bcc8e3c3a7 req-e86e2c69-82c8-4b4a-bec1-dbe3b961b304 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received unexpected event network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.630 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.632 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.708 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.773 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.774 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.837 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.995 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.996 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5687MB free_disk=73.33941650390625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.996 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:22:10 np0005544118 nova_compute[187283]: 2025-12-03 14:22:10.996 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:22:11 np0005544118 nova_compute[187283]: 2025-12-03 14:22:11.120 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 169bfb78-29ea-4873-be18-f12232b1ee89 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:22:11 np0005544118 nova_compute[187283]: 2025-12-03 14:22:11.121 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:22:11 np0005544118 nova_compute[187283]: 2025-12-03 14:22:11.121 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:22:11 np0005544118 nova_compute[187283]: 2025-12-03 14:22:11.169 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:22:11 np0005544118 nova_compute[187283]: 2025-12-03 14:22:11.190 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:22:11 np0005544118 nova_compute[187283]: 2025-12-03 14:22:11.241 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:22:11 np0005544118 nova_compute[187283]: 2025-12-03 14:22:11.242 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.245s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:22:12 np0005544118 podman[210401]: 2025-12-03 14:22:12.844391849 +0000 UTC m=+0.078955246 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:22:13 np0005544118 nova_compute[187283]: 2025-12-03 14:22:13.243 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:13 np0005544118 nova_compute[187283]: 2025-12-03 14:22:13.603 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:13 np0005544118 nova_compute[187283]: 2025-12-03 14:22:13.913 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:14 np0005544118 nova_compute[187283]: 2025-12-03 14:22:14.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:22:14 np0005544118 nova_compute[187283]: 2025-12-03 14:22:14.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:22:14 np0005544118 nova_compute[187283]: 2025-12-03 14:22:14.714 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:16 np0005544118 podman[210422]: 2025-12-03 14:22:16.832373935 +0000 UTC m=+0.056369122 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  3 09:22:18 np0005544118 nova_compute[187283]: 2025-12-03 14:22:18.915 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:22:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:22:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:22:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:22:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:22:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:22:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:22:19 np0005544118 nova_compute[187283]: 2025-12-03 14:22:19.715 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:21Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7d:94:39 10.100.0.12
Dec  3 09:22:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:21Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7d:94:39 10.100.0.12
Dec  3 09:22:23 np0005544118 nova_compute[187283]: 2025-12-03 14:22:23.916 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:24 np0005544118 nova_compute[187283]: 2025-12-03 14:22:24.718 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:26 np0005544118 podman[210459]: 2025-12-03 14:22:26.82360902 +0000 UTC m=+0.050258556 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  3 09:22:28 np0005544118 podman[210479]: 2025-12-03 14:22:28.855054721 +0000 UTC m=+0.074667129 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:22:28 np0005544118 nova_compute[187283]: 2025-12-03 14:22:28.920 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:29 np0005544118 nova_compute[187283]: 2025-12-03 14:22:29.720 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:32 np0005544118 podman[210503]: 2025-12-03 14:22:32.867750109 +0000 UTC m=+0.097557761 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  3 09:22:33 np0005544118 nova_compute[187283]: 2025-12-03 14:22:33.922 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:34 np0005544118 nova_compute[187283]: 2025-12-03 14:22:34.755 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:35 np0005544118 podman[197639]: time="2025-12-03T14:22:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:22:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:22:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:22:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:22:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3050 "" "Go-http-client/1.1"
Dec  3 09:22:38 np0005544118 nova_compute[187283]: 2025-12-03 14:22:38.927 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:39 np0005544118 nova_compute[187283]: 2025-12-03 14:22:39.762 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:43 np0005544118 podman[210530]: 2025-12-03 14:22:43.872076734 +0000 UTC m=+0.102561187 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:22:43 np0005544118 nova_compute[187283]: 2025-12-03 14:22:43.931 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:44 np0005544118 nova_compute[187283]: 2025-12-03 14:22:44.764 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:47 np0005544118 podman[210551]: 2025-12-03 14:22:47.8971939 +0000 UTC m=+0.115713834 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:22:48 np0005544118 nova_compute[187283]: 2025-12-03 14:22:48.933 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:22:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:22:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:22:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:22:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:22:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:22:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:22:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:22:49 np0005544118 nova_compute[187283]: 2025-12-03 14:22:49.766 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:49 np0005544118 ovn_controller[95637]: 2025-12-03T14:22:49Z|00073|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Dec  3 09:22:53 np0005544118 nova_compute[187283]: 2025-12-03 14:22:53.934 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:54 np0005544118 nova_compute[187283]: 2025-12-03 14:22:54.767 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:57 np0005544118 podman[210572]: 2025-12-03 14:22:57.866952131 +0000 UTC m=+0.090029506 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:22:58 np0005544118 nova_compute[187283]: 2025-12-03 14:22:58.936 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:59 np0005544118 nova_compute[187283]: 2025-12-03 14:22:59.769 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:22:59 np0005544118 podman[210594]: 2025-12-03 14:22:59.872467877 +0000 UTC m=+0.087250400 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:23:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:00.954 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:03 np0005544118 podman[210620]: 2025-12-03 14:23:03.880160613 +0000 UTC m=+0.106323960 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:23:03 np0005544118 nova_compute[187283]: 2025-12-03 14:23:03.937 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:04 np0005544118 nova_compute[187283]: 2025-12-03 14:23:04.770 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:05 np0005544118 podman[197639]: time="2025-12-03T14:23:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:23:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:23:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:23:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:23:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3051 "" "Go-http-client/1.1"
Dec  3 09:23:06 np0005544118 nova_compute[187283]: 2025-12-03 14:23:06.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:06 np0005544118 nova_compute[187283]: 2025-12-03 14:23:06.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:23:06 np0005544118 nova_compute[187283]: 2025-12-03 14:23:06.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:23:07 np0005544118 nova_compute[187283]: 2025-12-03 14:23:07.392 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:23:07 np0005544118 nova_compute[187283]: 2025-12-03 14:23:07.392 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:23:07 np0005544118 nova_compute[187283]: 2025-12-03 14:23:07.393 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:23:07 np0005544118 nova_compute[187283]: 2025-12-03 14:23:07.393 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 169bfb78-29ea-4873-be18-f12232b1ee89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:23:08 np0005544118 nova_compute[187283]: 2025-12-03 14:23:08.754 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Updating instance_info_cache with network_info: [{"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:23:08 np0005544118 nova_compute[187283]: 2025-12-03 14:23:08.776 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-169bfb78-29ea-4873-be18-f12232b1ee89" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:23:08 np0005544118 nova_compute[187283]: 2025-12-03 14:23:08.777 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:23:08 np0005544118 nova_compute[187283]: 2025-12-03 14:23:08.777 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:08 np0005544118 nova_compute[187283]: 2025-12-03 14:23:08.777 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:08 np0005544118 nova_compute[187283]: 2025-12-03 14:23:08.778 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:08 np0005544118 nova_compute[187283]: 2025-12-03 14:23:08.940 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:09 np0005544118 nova_compute[187283]: 2025-12-03 14:23:09.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:09 np0005544118 nova_compute[187283]: 2025-12-03 14:23:09.772 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.778 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.779 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.779 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.780 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.869 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.961 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:12 np0005544118 nova_compute[187283]: 2025-12-03 14:23:12.963 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.045 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.296 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.299 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5689MB free_disk=73.30766296386719GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.299 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.300 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.343 187287 DEBUG nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Creating tmpfile /var/lib/nova/instances/tmpg4gqj_4b to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.445 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 169bfb78-29ea-4873-be18-f12232b1ee89 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.454 187287 DEBUG nova.compute.manager [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg4gqj_4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.464 187287 WARNING nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 093f606d-3ef8-48cf-af31-20ce774d31ec has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.465 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.465 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.565 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.580 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.581 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.582 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.282s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:13 np0005544118 nova_compute[187283]: 2025-12-03 14:23:13.942 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:14 np0005544118 nova_compute[187283]: 2025-12-03 14:23:14.185 187287 DEBUG nova.compute.manager [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg4gqj_4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='093f606d-3ef8-48cf-af31-20ce774d31ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:23:14 np0005544118 nova_compute[187283]: 2025-12-03 14:23:14.209 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-093f606d-3ef8-48cf-af31-20ce774d31ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:23:14 np0005544118 nova_compute[187283]: 2025-12-03 14:23:14.209 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-093f606d-3ef8-48cf-af31-20ce774d31ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:23:14 np0005544118 nova_compute[187283]: 2025-12-03 14:23:14.210 187287 DEBUG nova.network.neutron [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:23:14 np0005544118 nova_compute[187283]: 2025-12-03 14:23:14.576 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:14 np0005544118 nova_compute[187283]: 2025-12-03 14:23:14.774 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:14 np0005544118 podman[210653]: 2025-12-03 14:23:14.870793563 +0000 UTC m=+0.089006148 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal)
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.601 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.624 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.624 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.711 187287 DEBUG nova.network.neutron [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Updating instance_info_cache with network_info: [{"id": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "address": "fa:16:3e:cb:37:11", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aa734c0-08", "ovs_interfaceid": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.732 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-093f606d-3ef8-48cf-af31-20ce774d31ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.735 187287 DEBUG nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg4gqj_4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='093f606d-3ef8-48cf-af31-20ce774d31ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.737 187287 DEBUG nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Creating instance directory: /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.737 187287 DEBUG nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Creating disk.info with the contents: {'/var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk': 'qcow2', '/var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.738 187287 DEBUG nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.740 187287 DEBUG nova.objects.instance [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 093f606d-3ef8-48cf-af31-20ce774d31ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.779 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.834 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.835 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.835 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.847 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.919 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.920 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.965 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.966 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:16 np0005544118 nova_compute[187283]: 2025-12-03 14:23:16.966 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.022 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.023 187287 DEBUG nova.virt.disk.api [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.023 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.076 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.077 187287 DEBUG nova.virt.disk.api [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.077 187287 DEBUG nova.objects.instance [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 093f606d-3ef8-48cf-af31-20ce774d31ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.091 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.114 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk.config 485376" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.115 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk.config to /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.115 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk.config /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.658 187287 DEBUG oslo_concurrency.processutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec/disk.config /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec" returned: 0 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.660 187287 DEBUG nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.662 187287 DEBUG nova.virt.libvirt.vif [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:22:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1605689134',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1605689134',id=8,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:22:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6e8b2813b7af41f980b694f72644be72',ramdisk_id='',reservation_id='r-p8hw6wol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2098095069',owner_user_name='tempest-TestExecuteBasicStrategy-2098095069-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:22:26Z,user_data=None,user_id='e5685807f9ad4ce3bc2025cc88a7ce46',uuid=093f606d-3ef8-48cf-af31-20ce774d31ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "address": "fa:16:3e:cb:37:11", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7aa734c0-08", "ovs_interfaceid": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.663 187287 DEBUG nova.network.os_vif_util [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "address": "fa:16:3e:cb:37:11", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap7aa734c0-08", "ovs_interfaceid": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.665 187287 DEBUG nova.network.os_vif_util [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cb:37:11,bridge_name='br-int',has_traffic_filtering=True,id=7aa734c0-08ba-4160-a5a1-a2e79961d225,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aa734c0-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.666 187287 DEBUG os_vif [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:37:11,bridge_name='br-int',has_traffic_filtering=True,id=7aa734c0-08ba-4160-a5a1-a2e79961d225,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aa734c0-08') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.667 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.668 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.669 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.673 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.674 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7aa734c0-08, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.675 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7aa734c0-08, col_values=(('external_ids', {'iface-id': '7aa734c0-08ba-4160-a5a1-a2e79961d225', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cb:37:11', 'vm-uuid': '093f606d-3ef8-48cf-af31-20ce774d31ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.721 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:17 np0005544118 NetworkManager[55710]: <info>  [1764771797.7217] manager: (tap7aa734c0-08): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.724 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.731 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.732 187287 INFO os_vif [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cb:37:11,bridge_name='br-int',has_traffic_filtering=True,id=7aa734c0-08ba-4160-a5a1-a2e79961d225,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aa734c0-08')#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.733 187287 DEBUG nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:23:17 np0005544118 nova_compute[187283]: 2025-12-03 14:23:17.733 187287 DEBUG nova.compute.manager [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg4gqj_4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='093f606d-3ef8-48cf-af31-20ce774d31ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:23:18 np0005544118 nova_compute[187283]: 2025-12-03 14:23:18.200 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:18 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:18.202 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:23:18 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:18.204 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:23:18 np0005544118 nova_compute[187283]: 2025-12-03 14:23:18.476 187287 DEBUG nova.network.neutron [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Port 7aa734c0-08ba-4160-a5a1-a2e79961d225 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:23:18 np0005544118 nova_compute[187283]: 2025-12-03 14:23:18.478 187287 DEBUG nova.compute.manager [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg4gqj_4b',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='093f606d-3ef8-48cf-af31-20ce774d31ec',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:23:18 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:23:18 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:23:18 np0005544118 podman[210697]: 2025-12-03 14:23:18.680100708 +0000 UTC m=+0.065713807 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:23:18 np0005544118 kernel: tap7aa734c0-08: entered promiscuous mode
Dec  3 09:23:18 np0005544118 NetworkManager[55710]: <info>  [1764771798.8432] manager: (tap7aa734c0-08): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec  3 09:23:18 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:18Z|00074|binding|INFO|Claiming lport 7aa734c0-08ba-4160-a5a1-a2e79961d225 for this additional chassis.
Dec  3 09:23:18 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:18Z|00075|binding|INFO|7aa734c0-08ba-4160-a5a1-a2e79961d225: Claiming fa:16:3e:cb:37:11 10.100.0.6
Dec  3 09:23:18 np0005544118 nova_compute[187283]: 2025-12-03 14:23:18.847 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:18 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:18Z|00076|binding|INFO|Setting lport 7aa734c0-08ba-4160-a5a1-a2e79961d225 ovn-installed in OVS
Dec  3 09:23:18 np0005544118 nova_compute[187283]: 2025-12-03 14:23:18.861 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:18 np0005544118 nova_compute[187283]: 2025-12-03 14:23:18.862 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:18 np0005544118 systemd-machined[153602]: New machine qemu-6-instance-00000008.
Dec  3 09:23:18 np0005544118 systemd[1]: Started Virtual Machine qemu-6-instance-00000008.
Dec  3 09:23:18 np0005544118 systemd-udevd[210751]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:23:18 np0005544118 NetworkManager[55710]: <info>  [1764771798.9224] device (tap7aa734c0-08): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:23:18 np0005544118 NetworkManager[55710]: <info>  [1764771798.9239] device (tap7aa734c0-08): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:23:18 np0005544118 nova_compute[187283]: 2025-12-03 14:23:18.944 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:19 np0005544118 nova_compute[187283]: 2025-12-03 14:23:19.347 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771799.3467207, 093f606d-3ef8-48cf-af31-20ce774d31ec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:23:19 np0005544118 nova_compute[187283]: 2025-12-03 14:23:19.348 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] VM Started (Lifecycle Event)#033[00m
Dec  3 09:23:19 np0005544118 nova_compute[187283]: 2025-12-03 14:23:19.373 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:23:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:23:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:23:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:23:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:23:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:23:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:23:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:23:20 np0005544118 nova_compute[187283]: 2025-12-03 14:23:20.233 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764771800.2326393, 093f606d-3ef8-48cf-af31-20ce774d31ec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:23:20 np0005544118 nova_compute[187283]: 2025-12-03 14:23:20.233 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:23:20 np0005544118 nova_compute[187283]: 2025-12-03 14:23:20.253 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:23:20 np0005544118 nova_compute[187283]: 2025-12-03 14:23:20.257 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:23:20 np0005544118 nova_compute[187283]: 2025-12-03 14:23:20.281 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:23:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:22Z|00077|binding|INFO|Claiming lport 7aa734c0-08ba-4160-a5a1-a2e79961d225 for this chassis.
Dec  3 09:23:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:22Z|00078|binding|INFO|7aa734c0-08ba-4160-a5a1-a2e79961d225: Claiming fa:16:3e:cb:37:11 10.100.0.6
Dec  3 09:23:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:22Z|00079|binding|INFO|Setting lport 7aa734c0-08ba-4160-a5a1-a2e79961d225 up in Southbound
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.023 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:37:11 10.100.0.6'], port_security=['fa:16:3e:cb:37:11 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '093f606d-3ef8-48cf-af31-20ce774d31ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b658e100-6efa-4402-8cec-ff46a9090590', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e8b2813b7af41f980b694f72644be72', 'neutron:revision_number': '11', 'neutron:security_group_ids': '11bbe6bc-d093-4c62-ae8a-84d38be042b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1f3a8d9-b8f1-4b67-8021-a6ae7fa21fbd, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=7aa734c0-08ba-4160-a5a1-a2e79961d225) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.024 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 7aa734c0-08ba-4160-a5a1-a2e79961d225 in datapath b658e100-6efa-4402-8cec-ff46a9090590 bound to our chassis#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.026 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b658e100-6efa-4402-8cec-ff46a9090590#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.044 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd7007e-5c7a-4440-9b65-b41eaccbac06]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.095 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[53405913-cef2-41cd-aae7-ca496ed846ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.098 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[31c0f9b8-3c7e-4ef3-be8e-22308f32ba5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.141 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[88e19b43-f5a9-4b0b-a2d1-c2857ef42779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.168 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d55574ed-8748-4a9c-b544-64fe565410ee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb658e100-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:73:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388247, 'reachable_time': 19858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210785, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.195 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[de94d1e9-8b77-4088-b6c8-80ae689b2832]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb658e100-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388259, 'tstamp': 388259}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210786, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb658e100-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388262, 'tstamp': 388262}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210786, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.198 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb658e100-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:22 np0005544118 nova_compute[187283]: 2025-12-03 14:23:22.250 187287 INFO nova.compute.manager [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Post operation of migration started#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.251 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb658e100-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.251 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.252 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb658e100-60, col_values=(('external_ids', {'iface-id': '4d87a9de-44de-4e18-91fc-35b66b76f3c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:22.252 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:23:22 np0005544118 nova_compute[187283]: 2025-12-03 14:23:22.252 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:22 np0005544118 nova_compute[187283]: 2025-12-03 14:23:22.721 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:22 np0005544118 nova_compute[187283]: 2025-12-03 14:23:22.912 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-093f606d-3ef8-48cf-af31-20ce774d31ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:23:22 np0005544118 nova_compute[187283]: 2025-12-03 14:23:22.913 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-093f606d-3ef8-48cf-af31-20ce774d31ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:23:22 np0005544118 nova_compute[187283]: 2025-12-03 14:23:22.913 187287 DEBUG nova.network.neutron [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:23:23 np0005544118 nova_compute[187283]: 2025-12-03 14:23:23.947 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:24 np0005544118 nova_compute[187283]: 2025-12-03 14:23:24.051 187287 DEBUG nova.network.neutron [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Updating instance_info_cache with network_info: [{"id": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "address": "fa:16:3e:cb:37:11", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aa734c0-08", "ovs_interfaceid": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:23:24 np0005544118 nova_compute[187283]: 2025-12-03 14:23:24.070 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-093f606d-3ef8-48cf-af31-20ce774d31ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:23:24 np0005544118 nova_compute[187283]: 2025-12-03 14:23:24.085 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:24 np0005544118 nova_compute[187283]: 2025-12-03 14:23:24.085 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:24 np0005544118 nova_compute[187283]: 2025-12-03 14:23:24.086 187287 DEBUG oslo_concurrency.lockutils [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:24 np0005544118 nova_compute[187283]: 2025-12-03 14:23:24.091 187287 INFO nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:23:24 np0005544118 virtqemud[186958]: Domain id=6 name='instance-00000008' uuid=093f606d-3ef8-48cf-af31-20ce774d31ec is tainted: custom-monitor
Dec  3 09:23:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:24.207 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:25 np0005544118 nova_compute[187283]: 2025-12-03 14:23:25.098 187287 INFO nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:23:26 np0005544118 nova_compute[187283]: 2025-12-03 14:23:26.103 187287 INFO nova.virt.libvirt.driver [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:23:26 np0005544118 nova_compute[187283]: 2025-12-03 14:23:26.108 187287 DEBUG nova.compute.manager [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:23:26 np0005544118 nova_compute[187283]: 2025-12-03 14:23:26.135 187287 DEBUG nova.objects.instance [None req-b6ed1d92-53b3-431a-817f-4d71fbd2b53a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:23:27 np0005544118 nova_compute[187283]: 2025-12-03 14:23:27.722 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:28 np0005544118 podman[210787]: 2025-12-03 14:23:28.86869893 +0000 UTC m=+0.072476390 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  3 09:23:28 np0005544118 nova_compute[187283]: 2025-12-03 14:23:28.950 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:30 np0005544118 podman[210806]: 2025-12-03 14:23:30.814675903 +0000 UTC m=+0.052054417 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.509 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "093f606d-3ef8-48cf-af31-20ce774d31ec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.509 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.510 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.510 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.510 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.511 187287 INFO nova.compute.manager [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Terminating instance#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.512 187287 DEBUG nova.compute.manager [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:23:31 np0005544118 kernel: tap7aa734c0-08 (unregistering): left promiscuous mode
Dec  3 09:23:31 np0005544118 NetworkManager[55710]: <info>  [1764771811.5496] device (tap7aa734c0-08): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:23:31 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:31Z|00080|binding|INFO|Releasing lport 7aa734c0-08ba-4160-a5a1-a2e79961d225 from this chassis (sb_readonly=0)
Dec  3 09:23:31 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:31Z|00081|binding|INFO|Setting lport 7aa734c0-08ba-4160-a5a1-a2e79961d225 down in Southbound
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.595 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:31Z|00082|binding|INFO|Removing iface tap7aa734c0-08 ovn-installed in OVS
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.597 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.604 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cb:37:11 10.100.0.6'], port_security=['fa:16:3e:cb:37:11 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '093f606d-3ef8-48cf-af31-20ce774d31ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b658e100-6efa-4402-8cec-ff46a9090590', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e8b2813b7af41f980b694f72644be72', 'neutron:revision_number': '13', 'neutron:security_group_ids': '11bbe6bc-d093-4c62-ae8a-84d38be042b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1f3a8d9-b8f1-4b67-8021-a6ae7fa21fbd, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=7aa734c0-08ba-4160-a5a1-a2e79961d225) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.606 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 7aa734c0-08ba-4160-a5a1-a2e79961d225 in datapath b658e100-6efa-4402-8cec-ff46a9090590 unbound from our chassis#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.609 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b658e100-6efa-4402-8cec-ff46a9090590#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.611 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.627 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[69ed36d1-0f6b-4d24-b3e6-60df324567c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:31 np0005544118 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec  3 09:23:31 np0005544118 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000008.scope: Consumed 1.500s CPU time.
Dec  3 09:23:31 np0005544118 systemd-machined[153602]: Machine qemu-6-instance-00000008 terminated.
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.662 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[0ea66afe-cf55-479b-8388-6c9240e26c38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.667 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[68c7dd21-b482-4e1f-8126-848ebefc1c78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.699 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9c10b1-217d-4f25-9e59-f5f2efab0a68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.719 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[59114e5f-bb09-4eb4-91a1-fdf733fa230c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb658e100-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f4:73:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388247, 'reachable_time': 19858, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210842, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.741 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0f470637-abde-4767-81e4-ee1136824641]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapb658e100-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388259, 'tstamp': 388259}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210844, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapb658e100-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388262, 'tstamp': 388262}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 210844, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.744 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb658e100-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.746 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.759 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.760 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb658e100-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.760 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.761 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb658e100-60, col_values=(('external_ids', {'iface-id': '4d87a9de-44de-4e18-91fc-35b66b76f3c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:31 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:31.761 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.795 187287 INFO nova.virt.libvirt.driver [-] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Instance destroyed successfully.#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.796 187287 DEBUG nova.objects.instance [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lazy-loading 'resources' on Instance uuid 093f606d-3ef8-48cf-af31-20ce774d31ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.829 187287 DEBUG nova.virt.libvirt.vif [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:22:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-1605689134',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-1605689134',id=8,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:22:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e8b2813b7af41f980b694f72644be72',ramdisk_id='',reservation_id='r-p8hw6wol',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min
_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2098095069',owner_user_name='tempest-TestExecuteBasicStrategy-2098095069-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:23:26Z,user_data=None,user_id='e5685807f9ad4ce3bc2025cc88a7ce46',uuid=093f606d-3ef8-48cf-af31-20ce774d31ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "address": "fa:16:3e:cb:37:11", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aa734c0-08", "ovs_interfaceid": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.830 187287 DEBUG nova.network.os_vif_util [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converting VIF {"id": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "address": "fa:16:3e:cb:37:11", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7aa734c0-08", "ovs_interfaceid": "7aa734c0-08ba-4160-a5a1-a2e79961d225", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.831 187287 DEBUG nova.network.os_vif_util [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cb:37:11,bridge_name='br-int',has_traffic_filtering=True,id=7aa734c0-08ba-4160-a5a1-a2e79961d225,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aa734c0-08') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.831 187287 DEBUG os_vif [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:37:11,bridge_name='br-int',has_traffic_filtering=True,id=7aa734c0-08ba-4160-a5a1-a2e79961d225,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aa734c0-08') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.833 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.833 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7aa734c0-08, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.835 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.837 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.840 187287 INFO os_vif [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cb:37:11,bridge_name='br-int',has_traffic_filtering=True,id=7aa734c0-08ba-4160-a5a1-a2e79961d225,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7aa734c0-08')#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.840 187287 INFO nova.virt.libvirt.driver [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Deleting instance files /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec_del#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.841 187287 INFO nova.virt.libvirt.driver [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Deletion of /var/lib/nova/instances/093f606d-3ef8-48cf-af31-20ce774d31ec_del complete#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.889 187287 DEBUG nova.compute.manager [req-2fa0cb29-635e-4ba2-a048-62077454b7b9 req-96899b86-3890-45f6-b8da-0d397ca00553 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Received event network-vif-unplugged-7aa734c0-08ba-4160-a5a1-a2e79961d225 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.890 187287 DEBUG oslo_concurrency.lockutils [req-2fa0cb29-635e-4ba2-a048-62077454b7b9 req-96899b86-3890-45f6-b8da-0d397ca00553 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.890 187287 DEBUG oslo_concurrency.lockutils [req-2fa0cb29-635e-4ba2-a048-62077454b7b9 req-96899b86-3890-45f6-b8da-0d397ca00553 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.890 187287 DEBUG oslo_concurrency.lockutils [req-2fa0cb29-635e-4ba2-a048-62077454b7b9 req-96899b86-3890-45f6-b8da-0d397ca00553 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.890 187287 DEBUG nova.compute.manager [req-2fa0cb29-635e-4ba2-a048-62077454b7b9 req-96899b86-3890-45f6-b8da-0d397ca00553 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] No waiting events found dispatching network-vif-unplugged-7aa734c0-08ba-4160-a5a1-a2e79961d225 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.891 187287 DEBUG nova.compute.manager [req-2fa0cb29-635e-4ba2-a048-62077454b7b9 req-96899b86-3890-45f6-b8da-0d397ca00553 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Received event network-vif-unplugged-7aa734c0-08ba-4160-a5a1-a2e79961d225 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.915 187287 INFO nova.compute.manager [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.915 187287 DEBUG oslo.service.loopingcall [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.916 187287 DEBUG nova.compute.manager [-] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:23:31 np0005544118 nova_compute[187283]: 2025-12-03 14:23:31.916 187287 DEBUG nova.network.neutron [-] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:23:32 np0005544118 nova_compute[187283]: 2025-12-03 14:23:32.467 187287 DEBUG nova.network.neutron [-] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:23:32 np0005544118 nova_compute[187283]: 2025-12-03 14:23:32.494 187287 INFO nova.compute.manager [-] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Took 0.58 seconds to deallocate network for instance.#033[00m
Dec  3 09:23:32 np0005544118 nova_compute[187283]: 2025-12-03 14:23:32.537 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:32 np0005544118 nova_compute[187283]: 2025-12-03 14:23:32.537 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:32 np0005544118 nova_compute[187283]: 2025-12-03 14:23:32.542 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:32 np0005544118 nova_compute[187283]: 2025-12-03 14:23:32.581 187287 INFO nova.scheduler.client.report [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Deleted allocations for instance 093f606d-3ef8-48cf-af31-20ce774d31ec#033[00m
Dec  3 09:23:32 np0005544118 nova_compute[187283]: 2025-12-03 14:23:32.638 187287 DEBUG oslo_concurrency.lockutils [None req-8d3c44f5-aac6-40aa-8b31-3f8a529328c2 e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.216 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.216 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.217 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.217 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.217 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.218 187287 INFO nova.compute.manager [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Terminating instance#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.219 187287 DEBUG nova.compute.manager [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:23:33 np0005544118 kernel: tap7a953924-46 (unregistering): left promiscuous mode
Dec  3 09:23:33 np0005544118 NetworkManager[55710]: <info>  [1764771813.2481] device (tap7a953924-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.251 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:33Z|00083|binding|INFO|Releasing lport 7a953924-4659-41c3-8e24-4d900f93e547 from this chassis (sb_readonly=0)
Dec  3 09:23:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:33Z|00084|binding|INFO|Setting lport 7a953924-4659-41c3-8e24-4d900f93e547 down in Southbound
Dec  3 09:23:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:23:33Z|00085|binding|INFO|Removing iface tap7a953924-46 ovn-installed in OVS
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.254 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.260 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7d:94:39 10.100.0.12'], port_security=['fa:16:3e:7d:94:39 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '169bfb78-29ea-4873-be18-f12232b1ee89', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b658e100-6efa-4402-8cec-ff46a9090590', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e8b2813b7af41f980b694f72644be72', 'neutron:revision_number': '4', 'neutron:security_group_ids': '11bbe6bc-d093-4c62-ae8a-84d38be042b3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1f3a8d9-b8f1-4b67-8021-a6ae7fa21fbd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=7a953924-4659-41c3-8e24-4d900f93e547) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.261 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 7a953924-4659-41c3-8e24-4d900f93e547 in datapath b658e100-6efa-4402-8cec-ff46a9090590 unbound from our chassis#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.262 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b658e100-6efa-4402-8cec-ff46a9090590, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.263 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[df8ce8d4-7877-4df6-9f36-c1dba7122774]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.264 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590 namespace which is not needed anymore#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.281 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec  3 09:23:33 np0005544118 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000007.scope: Consumed 15.996s CPU time.
Dec  3 09:23:33 np0005544118 systemd-machined[153602]: Machine qemu-5-instance-00000007 terminated.
Dec  3 09:23:33 np0005544118 neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590[210378]: [NOTICE]   (210382) : haproxy version is 2.8.14-c23fe91
Dec  3 09:23:33 np0005544118 neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590[210378]: [NOTICE]   (210382) : path to executable is /usr/sbin/haproxy
Dec  3 09:23:33 np0005544118 neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590[210378]: [WARNING]  (210382) : Exiting Master process...
Dec  3 09:23:33 np0005544118 neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590[210378]: [ALERT]    (210382) : Current worker (210384) exited with code 143 (Terminated)
Dec  3 09:23:33 np0005544118 neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590[210378]: [WARNING]  (210382) : All workers exited. Exiting... (0)
Dec  3 09:23:33 np0005544118 systemd[1]: libpod-ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3.scope: Deactivated successfully.
Dec  3 09:23:33 np0005544118 podman[210885]: 2025-12-03 14:23:33.475910918 +0000 UTC m=+0.080934968 container died ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.485 187287 INFO nova.virt.libvirt.driver [-] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Instance destroyed successfully.#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.486 187287 DEBUG nova.objects.instance [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lazy-loading 'resources' on Instance uuid 169bfb78-29ea-4873-be18-f12232b1ee89 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.499 187287 DEBUG nova.virt.libvirt.vif [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:22:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-885334645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-885334645',id=7,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:22:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e8b2813b7af41f980b694f72644be72',ramdisk_id='',reservation_id='r-gfpncl59',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-2098095069',owner_user_name='tempest-TestExecuteBasicStrategy-2098095069-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:22:08Z,user_data=None,user_id='e5685807f9ad4ce3bc2025cc88a7ce46',uuid=169bfb78-29ea-4873-be18-f12232b1ee89,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.500 187287 DEBUG nova.network.os_vif_util [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converting VIF {"id": "7a953924-4659-41c3-8e24-4d900f93e547", "address": "fa:16:3e:7d:94:39", "network": {"id": "b658e100-6efa-4402-8cec-ff46a9090590", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-677695978-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e8b2813b7af41f980b694f72644be72", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7a953924-46", "ovs_interfaceid": "7a953924-4659-41c3-8e24-4d900f93e547", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.500 187287 DEBUG nova.network.os_vif_util [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7d:94:39,bridge_name='br-int',has_traffic_filtering=True,id=7a953924-4659-41c3-8e24-4d900f93e547,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a953924-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.501 187287 DEBUG os_vif [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:94:39,bridge_name='br-int',has_traffic_filtering=True,id=7a953924-4659-41c3-8e24-4d900f93e547,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a953924-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.502 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.502 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a953924-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.504 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.506 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.509 187287 INFO os_vif [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7d:94:39,bridge_name='br-int',has_traffic_filtering=True,id=7a953924-4659-41c3-8e24-4d900f93e547,network=Network(b658e100-6efa-4402-8cec-ff46a9090590),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7a953924-46')#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.510 187287 INFO nova.virt.libvirt.driver [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Deleting instance files /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89_del#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.510 187287 INFO nova.virt.libvirt.driver [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Deletion of /var/lib/nova/instances/169bfb78-29ea-4873-be18-f12232b1ee89_del complete#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.592 187287 INFO nova.compute.manager [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.593 187287 DEBUG oslo.service.loopingcall [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.595 187287 DEBUG nova.compute.manager [-] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.595 187287 DEBUG nova.network.neutron [-] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:23:33 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3-userdata-shm.mount: Deactivated successfully.
Dec  3 09:23:33 np0005544118 systemd[1]: var-lib-containers-storage-overlay-e781b3e9f04ecf3cdef3ad97552be67021d7729e8ff0d05b14fee43b27663f81-merged.mount: Deactivated successfully.
Dec  3 09:23:33 np0005544118 podman[210885]: 2025-12-03 14:23:33.703603982 +0000 UTC m=+0.308628022 container cleanup ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:23:33 np0005544118 systemd[1]: libpod-conmon-ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3.scope: Deactivated successfully.
Dec  3 09:23:33 np0005544118 podman[210931]: 2025-12-03 14:23:33.866427121 +0000 UTC m=+0.137724432 container remove ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.871 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0d2f2539-bbb6-4833-a772-0659f8d93ce2]: (4, ('Wed Dec  3 02:23:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590 (ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3)\nce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3\nWed Dec  3 02:23:33 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590 (ce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3)\nce52868212d66ca36f89f50f54d1bc8faaddfcbf3a70d22586c44edcb8272bc3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.873 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8cfded-e170-425a-9d8e-10d288adc069]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.874 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb658e100-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.876 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 kernel: tapb658e100-60: left promiscuous mode
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.877 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.882 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[34ffabfd-45cb-4eb2-b2f1-742da079ee1c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.888 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.901 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d479ab4d-f1f1-45c0-ae50-e5687f82d379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.902 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e0b87e4a-3cde-40c9-ac78-8a47b70adc03]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.906 187287 DEBUG nova.compute.manager [req-037ba9bc-3fdd-41cf-bc81-a486c349a94a req-6f427993-b1b0-4248-9a7c-91e3d86022f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received event network-vif-unplugged-7a953924-4659-41c3-8e24-4d900f93e547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.906 187287 DEBUG oslo_concurrency.lockutils [req-037ba9bc-3fdd-41cf-bc81-a486c349a94a req-6f427993-b1b0-4248-9a7c-91e3d86022f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.906 187287 DEBUG oslo_concurrency.lockutils [req-037ba9bc-3fdd-41cf-bc81-a486c349a94a req-6f427993-b1b0-4248-9a7c-91e3d86022f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.907 187287 DEBUG oslo_concurrency.lockutils [req-037ba9bc-3fdd-41cf-bc81-a486c349a94a req-6f427993-b1b0-4248-9a7c-91e3d86022f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.907 187287 DEBUG nova.compute.manager [req-037ba9bc-3fdd-41cf-bc81-a486c349a94a req-6f427993-b1b0-4248-9a7c-91e3d86022f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] No waiting events found dispatching network-vif-unplugged-7a953924-4659-41c3-8e24-4d900f93e547 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.907 187287 DEBUG nova.compute.manager [req-037ba9bc-3fdd-41cf-bc81-a486c349a94a req-6f427993-b1b0-4248-9a7c-91e3d86022f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received event network-vif-unplugged-7a953924-4659-41c3-8e24-4d900f93e547 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.917 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7a84e90d-bd7b-4def-8203-62bff0b7d846]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388240, 'reachable_time': 37494, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 210945, 'error': None, 'target': 'ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 systemd[1]: run-netns-ovnmeta\x2db658e100\x2d6efa\x2d4402\x2d8cec\x2dff46a9090590.mount: Deactivated successfully.
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.922 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b658e100-6efa-4402-8cec-ff46a9090590 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:23:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:23:33.923 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4f652f-b42c-42c9-b303-447c3e4fa90d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.952 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.997 187287 DEBUG nova.compute.manager [req-120ec9fe-4c9c-45b6-80b2-322e315d4adb req-16884839-e9e3-4d81-8802-2e3d6ff626c1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Received event network-vif-plugged-7aa734c0-08ba-4160-a5a1-a2e79961d225 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.997 187287 DEBUG oslo_concurrency.lockutils [req-120ec9fe-4c9c-45b6-80b2-322e315d4adb req-16884839-e9e3-4d81-8802-2e3d6ff626c1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.998 187287 DEBUG oslo_concurrency.lockutils [req-120ec9fe-4c9c-45b6-80b2-322e315d4adb req-16884839-e9e3-4d81-8802-2e3d6ff626c1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.998 187287 DEBUG oslo_concurrency.lockutils [req-120ec9fe-4c9c-45b6-80b2-322e315d4adb req-16884839-e9e3-4d81-8802-2e3d6ff626c1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "093f606d-3ef8-48cf-af31-20ce774d31ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.998 187287 DEBUG nova.compute.manager [req-120ec9fe-4c9c-45b6-80b2-322e315d4adb req-16884839-e9e3-4d81-8802-2e3d6ff626c1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] No waiting events found dispatching network-vif-plugged-7aa734c0-08ba-4160-a5a1-a2e79961d225 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.998 187287 WARNING nova.compute.manager [req-120ec9fe-4c9c-45b6-80b2-322e315d4adb req-16884839-e9e3-4d81-8802-2e3d6ff626c1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Received unexpected event network-vif-plugged-7aa734c0-08ba-4160-a5a1-a2e79961d225 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:23:33 np0005544118 nova_compute[187283]: 2025-12-03 14:23:33.998 187287 DEBUG nova.compute.manager [req-120ec9fe-4c9c-45b6-80b2-322e315d4adb req-16884839-e9e3-4d81-8802-2e3d6ff626c1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Received event network-vif-deleted-7aa734c0-08ba-4160-a5a1-a2e79961d225 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:23:34 np0005544118 podman[210943]: 2025-12-03 14:23:34.007515484 +0000 UTC m=+0.088545153 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.097 187287 DEBUG nova.network.neutron [-] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.148 187287 INFO nova.compute.manager [-] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Took 0.55 seconds to deallocate network for instance.#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.199 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.200 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.239 187287 DEBUG nova.compute.provider_tree [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.254 187287 DEBUG nova.scheduler.client.report [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.276 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.297 187287 INFO nova.scheduler.client.report [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Deleted allocations for instance 169bfb78-29ea-4873-be18-f12232b1ee89#033[00m
Dec  3 09:23:34 np0005544118 nova_compute[187283]: 2025-12-03 14:23:34.353 187287 DEBUG oslo_concurrency.lockutils [None req-a9b1ca40-de78-4e7d-a96c-4186c4077dca e5685807f9ad4ce3bc2025cc88a7ce46 6e8b2813b7af41f980b694f72644be72 - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:35 np0005544118 podman[197639]: time="2025-12-03T14:23:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:23:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:23:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:23:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:23:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec  3 09:23:35 np0005544118 nova_compute[187283]: 2025-12-03 14:23:35.971 187287 DEBUG nova.compute.manager [req-c1a08736-58c4-4cfc-ab1b-df529e7fb7a9 req-73561d4a-2581-4873-9858-eb7059a9bd3d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received event network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:23:35 np0005544118 nova_compute[187283]: 2025-12-03 14:23:35.971 187287 DEBUG oslo_concurrency.lockutils [req-c1a08736-58c4-4cfc-ab1b-df529e7fb7a9 req-73561d4a-2581-4873-9858-eb7059a9bd3d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:23:35 np0005544118 nova_compute[187283]: 2025-12-03 14:23:35.971 187287 DEBUG oslo_concurrency.lockutils [req-c1a08736-58c4-4cfc-ab1b-df529e7fb7a9 req-73561d4a-2581-4873-9858-eb7059a9bd3d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:23:35 np0005544118 nova_compute[187283]: 2025-12-03 14:23:35.972 187287 DEBUG oslo_concurrency.lockutils [req-c1a08736-58c4-4cfc-ab1b-df529e7fb7a9 req-73561d4a-2581-4873-9858-eb7059a9bd3d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "169bfb78-29ea-4873-be18-f12232b1ee89-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:23:35 np0005544118 nova_compute[187283]: 2025-12-03 14:23:35.972 187287 DEBUG nova.compute.manager [req-c1a08736-58c4-4cfc-ab1b-df529e7fb7a9 req-73561d4a-2581-4873-9858-eb7059a9bd3d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] No waiting events found dispatching network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:23:35 np0005544118 nova_compute[187283]: 2025-12-03 14:23:35.972 187287 WARNING nova.compute.manager [req-c1a08736-58c4-4cfc-ab1b-df529e7fb7a9 req-73561d4a-2581-4873-9858-eb7059a9bd3d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received unexpected event network-vif-plugged-7a953924-4659-41c3-8e24-4d900f93e547 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:23:36 np0005544118 nova_compute[187283]: 2025-12-03 14:23:36.066 187287 DEBUG nova.compute.manager [req-a238da81-d6f8-4676-a7ee-82c740d62386 req-19e3f505-a710-4c36-a604-4fb419bc5ff8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Received event network-vif-deleted-7a953924-4659-41c3-8e24-4d900f93e547 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:23:38 np0005544118 nova_compute[187283]: 2025-12-03 14:23:38.505 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:38 np0005544118 nova_compute[187283]: 2025-12-03 14:23:38.954 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:43 np0005544118 nova_compute[187283]: 2025-12-03 14:23:43.508 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:43 np0005544118 nova_compute[187283]: 2025-12-03 14:23:43.956 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:45 np0005544118 podman[210969]: 2025-12-03 14:23:45.832422378 +0000 UTC m=+0.059479228 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, version=9.6, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Dec  3 09:23:46 np0005544118 nova_compute[187283]: 2025-12-03 14:23:46.793 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764771811.7923574, 093f606d-3ef8-48cf-af31-20ce774d31ec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:23:46 np0005544118 nova_compute[187283]: 2025-12-03 14:23:46.793 187287 INFO nova.compute.manager [-] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:23:46 np0005544118 nova_compute[187283]: 2025-12-03 14:23:46.812 187287 DEBUG nova.compute.manager [None req-c9e46a04-8d10-4576-a21b-f75a1d655e08 - - - - - -] [instance: 093f606d-3ef8-48cf-af31-20ce774d31ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:23:48 np0005544118 nova_compute[187283]: 2025-12-03 14:23:48.484 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764771813.4828272, 169bfb78-29ea-4873-be18-f12232b1ee89 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:23:48 np0005544118 nova_compute[187283]: 2025-12-03 14:23:48.484 187287 INFO nova.compute.manager [-] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:23:48 np0005544118 nova_compute[187283]: 2025-12-03 14:23:48.512 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:48 np0005544118 nova_compute[187283]: 2025-12-03 14:23:48.601 187287 DEBUG nova.compute.manager [None req-ce7da3d7-db28-4887-9204-eb1f044c9306 - - - - - -] [instance: 169bfb78-29ea-4873-be18-f12232b1ee89] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:23:48 np0005544118 podman[210991]: 2025-12-03 14:23:48.846431646 +0000 UTC m=+0.078418320 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  3 09:23:48 np0005544118 nova_compute[187283]: 2025-12-03 14:23:48.958 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:23:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:23:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:23:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:23:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:23:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:23:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:23:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:23:53 np0005544118 nova_compute[187283]: 2025-12-03 14:23:53.514 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:53 np0005544118 nova_compute[187283]: 2025-12-03 14:23:53.959 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:58 np0005544118 nova_compute[187283]: 2025-12-03 14:23:58.556 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:58 np0005544118 nova_compute[187283]: 2025-12-03 14:23:58.960 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:23:59 np0005544118 podman[211011]: 2025-12-03 14:23:59.86233227 +0000 UTC m=+0.090216600 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:24:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:24:00.955 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:24:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:24:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:24:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:24:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:24:01 np0005544118 podman[211030]: 2025-12-03 14:24:01.81664503 +0000 UTC m=+0.046273652 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:24:03 np0005544118 nova_compute[187283]: 2025-12-03 14:24:03.558 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:03 np0005544118 nova_compute[187283]: 2025-12-03 14:24:03.961 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:04 np0005544118 ovn_controller[95637]: 2025-12-03T14:24:04Z|00086|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec  3 09:24:04 np0005544118 podman[211054]: 2025-12-03 14:24:04.871207623 +0000 UTC m=+0.102790358 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  3 09:24:05 np0005544118 podman[197639]: time="2025-12-03T14:24:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:24:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:24:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:24:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:24:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2585 "" "Go-http-client/1.1"
Dec  3 09:24:07 np0005544118 nova_compute[187283]: 2025-12-03 14:24:07.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:07 np0005544118 nova_compute[187283]: 2025-12-03 14:24:07.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:24:07 np0005544118 nova_compute[187283]: 2025-12-03 14:24:07.610 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:24:07 np0005544118 nova_compute[187283]: 2025-12-03 14:24:07.658 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:24:08 np0005544118 nova_compute[187283]: 2025-12-03 14:24:08.562 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:08 np0005544118 nova_compute[187283]: 2025-12-03 14:24:08.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:08 np0005544118 nova_compute[187283]: 2025-12-03 14:24:08.962 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:09 np0005544118 nova_compute[187283]: 2025-12-03 14:24:09.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:10 np0005544118 nova_compute[187283]: 2025-12-03 14:24:10.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:10 np0005544118 nova_compute[187283]: 2025-12-03 14:24:10.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:12 np0005544118 nova_compute[187283]: 2025-12-03 14:24:12.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.565 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.737 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.737 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.738 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.738 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.963 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.969 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.970 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5903MB free_disk=73.33639907836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.970 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:24:13 np0005544118 nova_compute[187283]: 2025-12-03 14:24:13.970 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.147 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.148 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.170 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.246 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.246 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.265 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.292 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.316 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.342 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.422 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:24:14 np0005544118 nova_compute[187283]: 2025-12-03 14:24:14.423 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:24:15 np0005544118 nova_compute[187283]: 2025-12-03 14:24:15.036 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:16 np0005544118 nova_compute[187283]: 2025-12-03 14:24:16.418 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:16 np0005544118 podman[211082]: 2025-12-03 14:24:16.850400686 +0000 UTC m=+0.074500424 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Dec  3 09:24:17 np0005544118 nova_compute[187283]: 2025-12-03 14:24:17.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:24:17 np0005544118 nova_compute[187283]: 2025-12-03 14:24:17.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:24:18 np0005544118 nova_compute[187283]: 2025-12-03 14:24:18.568 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:18 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:24:18.594 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:24:18 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:24:18.595 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:24:18 np0005544118 nova_compute[187283]: 2025-12-03 14:24:18.595 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:18 np0005544118 nova_compute[187283]: 2025-12-03 14:24:18.993 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:24:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:24:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:24:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:24:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:24:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:24:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:24:19 np0005544118 podman[211103]: 2025-12-03 14:24:19.889125021 +0000 UTC m=+0.104729592 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec  3 09:24:23 np0005544118 nova_compute[187283]: 2025-12-03 14:24:23.600 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:23 np0005544118 nova_compute[187283]: 2025-12-03 14:24:23.995 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:25 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:24:25.597 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:24:28 np0005544118 nova_compute[187283]: 2025-12-03 14:24:28.602 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:28 np0005544118 nova_compute[187283]: 2025-12-03 14:24:28.997 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:30 np0005544118 podman[211125]: 2025-12-03 14:24:30.832319059 +0000 UTC m=+0.061837792 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  3 09:24:32 np0005544118 podman[211144]: 2025-12-03 14:24:32.881066341 +0000 UTC m=+0.094049783 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:24:33 np0005544118 nova_compute[187283]: 2025-12-03 14:24:33.605 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:33 np0005544118 nova_compute[187283]: 2025-12-03 14:24:33.998 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:35 np0005544118 podman[197639]: time="2025-12-03T14:24:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:24:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:24:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:24:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:24:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec  3 09:24:35 np0005544118 podman[211168]: 2025-12-03 14:24:35.846339291 +0000 UTC m=+0.077184996 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:24:38 np0005544118 nova_compute[187283]: 2025-12-03 14:24:38.607 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:39 np0005544118 nova_compute[187283]: 2025-12-03 14:24:38.999 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:43 np0005544118 nova_compute[187283]: 2025-12-03 14:24:43.609 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:44 np0005544118 nova_compute[187283]: 2025-12-03 14:24:44.001 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:47 np0005544118 podman[211195]: 2025-12-03 14:24:47.870435618 +0000 UTC m=+0.096185290 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, config_id=edpm, distribution-scope=public)
Dec  3 09:24:48 np0005544118 nova_compute[187283]: 2025-12-03 14:24:48.647 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:49 np0005544118 nova_compute[187283]: 2025-12-03 14:24:49.004 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:24:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:24:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:24:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:24:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:24:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:24:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:24:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:24:50 np0005544118 ovn_controller[95637]: 2025-12-03T14:24:50Z|00087|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Dec  3 09:24:50 np0005544118 podman[211216]: 2025-12-03 14:24:50.821514155 +0000 UTC m=+0.057351141 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  3 09:24:53 np0005544118 nova_compute[187283]: 2025-12-03 14:24:53.651 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:54 np0005544118 nova_compute[187283]: 2025-12-03 14:24:54.005 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:58 np0005544118 nova_compute[187283]: 2025-12-03 14:24:58.654 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:24:59 np0005544118 nova_compute[187283]: 2025-12-03 14:24:59.008 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:25:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:25:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:25:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:25:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:25:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:25:01 np0005544118 podman[211236]: 2025-12-03 14:25:01.818480675 +0000 UTC m=+0.055097769 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  3 09:25:03 np0005544118 nova_compute[187283]: 2025-12-03 14:25:03.657 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:03 np0005544118 podman[211256]: 2025-12-03 14:25:03.83210605 +0000 UTC m=+0.064125434 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:25:04 np0005544118 nova_compute[187283]: 2025-12-03 14:25:04.008 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:05 np0005544118 podman[197639]: time="2025-12-03T14:25:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:25:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:25:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:25:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:25:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec  3 09:25:06 np0005544118 podman[211281]: 2025-12-03 14:25:06.871291667 +0000 UTC m=+0.092189982 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  3 09:25:08 np0005544118 nova_compute[187283]: 2025-12-03 14:25:08.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:08 np0005544118 nova_compute[187283]: 2025-12-03 14:25:08.659 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:09 np0005544118 nova_compute[187283]: 2025-12-03 14:25:09.010 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:09 np0005544118 nova_compute[187283]: 2025-12-03 14:25:09.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:09 np0005544118 nova_compute[187283]: 2025-12-03 14:25:09.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:25:09 np0005544118 nova_compute[187283]: 2025-12-03 14:25:09.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:25:09 np0005544118 nova_compute[187283]: 2025-12-03 14:25:09.662 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:25:10 np0005544118 nova_compute[187283]: 2025-12-03 14:25:10.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:10 np0005544118 nova_compute[187283]: 2025-12-03 14:25:10.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:12 np0005544118 nova_compute[187283]: 2025-12-03 14:25:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:13 np0005544118 nova_compute[187283]: 2025-12-03 14:25:13.661 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:14 np0005544118 nova_compute[187283]: 2025-12-03 14:25:14.011 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:14 np0005544118 nova_compute[187283]: 2025-12-03 14:25:14.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.683 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.684 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.684 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.684 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.807 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.808 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5899MB free_disk=73.33639907836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.808 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:25:15 np0005544118 nova_compute[187283]: 2025-12-03 14:25:15.808 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:25:16 np0005544118 nova_compute[187283]: 2025-12-03 14:25:16.108 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:25:16 np0005544118 nova_compute[187283]: 2025-12-03 14:25:16.109 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:25:16 np0005544118 nova_compute[187283]: 2025-12-03 14:25:16.135 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:25:16 np0005544118 nova_compute[187283]: 2025-12-03 14:25:16.394 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:25:16 np0005544118 nova_compute[187283]: 2025-12-03 14:25:16.396 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:25:16 np0005544118 nova_compute[187283]: 2025-12-03 14:25:16.396 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:25:18 np0005544118 nova_compute[187283]: 2025-12-03 14:25:18.665 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:18 np0005544118 podman[211308]: 2025-12-03 14:25:18.823427258 +0000 UTC m=+0.059976062 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  3 09:25:19 np0005544118 nova_compute[187283]: 2025-12-03 14:25:19.013 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:25:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:25:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:25:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:25:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:25:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:25:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:25:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:25:20.282 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:25:20 np0005544118 nova_compute[187283]: 2025-12-03 14:25:20.283 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:25:20.283 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:25:20 np0005544118 nova_compute[187283]: 2025-12-03 14:25:20.392 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:20 np0005544118 nova_compute[187283]: 2025-12-03 14:25:20.414 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:25:20 np0005544118 nova_compute[187283]: 2025-12-03 14:25:20.414 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:25:20 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:25:21 np0005544118 podman[211330]: 2025-12-03 14:25:21.877363213 +0000 UTC m=+0.101270338 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:25:23 np0005544118 nova_compute[187283]: 2025-12-03 14:25:23.717 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:24 np0005544118 nova_compute[187283]: 2025-12-03 14:25:24.014 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:25:24.286 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:25:28 np0005544118 nova_compute[187283]: 2025-12-03 14:25:28.720 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:29 np0005544118 nova_compute[187283]: 2025-12-03 14:25:29.016 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:32 np0005544118 podman[211350]: 2025-12-03 14:25:32.827864889 +0000 UTC m=+0.053466426 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:25:33 np0005544118 nova_compute[187283]: 2025-12-03 14:25:33.722 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:34 np0005544118 nova_compute[187283]: 2025-12-03 14:25:34.018 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:34 np0005544118 podman[211369]: 2025-12-03 14:25:34.806251191 +0000 UTC m=+0.044540205 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:25:35 np0005544118 podman[197639]: time="2025-12-03T14:25:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:25:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:25:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:25:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:25:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec  3 09:25:37 np0005544118 podman[211393]: 2025-12-03 14:25:37.84584609 +0000 UTC m=+0.080402505 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:25:38 np0005544118 nova_compute[187283]: 2025-12-03 14:25:38.725 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:39 np0005544118 nova_compute[187283]: 2025-12-03 14:25:39.019 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:43 np0005544118 nova_compute[187283]: 2025-12-03 14:25:43.728 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:44 np0005544118 nova_compute[187283]: 2025-12-03 14:25:44.020 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:48 np0005544118 nova_compute[187283]: 2025-12-03 14:25:48.732 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:49 np0005544118 nova_compute[187283]: 2025-12-03 14:25:49.022 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:25:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:25:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:25:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:25:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:25:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:25:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:25:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:25:49 np0005544118 podman[211419]: 2025-12-03 14:25:49.823498615 +0000 UTC m=+0.056146970 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6)
Dec  3 09:25:52 np0005544118 podman[211441]: 2025-12-03 14:25:52.829473098 +0000 UTC m=+0.061857757 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:25:53 np0005544118 nova_compute[187283]: 2025-12-03 14:25:53.758 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:54 np0005544118 nova_compute[187283]: 2025-12-03 14:25:54.024 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:58 np0005544118 nova_compute[187283]: 2025-12-03 14:25:58.761 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:25:59 np0005544118 nova_compute[187283]: 2025-12-03 14:25:59.025 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:00.956 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:00.957 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:00.957 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:03 np0005544118 nova_compute[187283]: 2025-12-03 14:26:03.763 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:03 np0005544118 podman[211462]: 2025-12-03 14:26:03.837424499 +0000 UTC m=+0.066766142 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:26:04 np0005544118 nova_compute[187283]: 2025-12-03 14:26:04.026 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:05 np0005544118 nova_compute[187283]: 2025-12-03 14:26:05.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:05 np0005544118 nova_compute[187283]: 2025-12-03 14:26:05.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 09:26:05 np0005544118 podman[197639]: time="2025-12-03T14:26:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:26:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:26:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:26:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:26:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2592 "" "Go-http-client/1.1"
Dec  3 09:26:05 np0005544118 podman[211481]: 2025-12-03 14:26:05.809014215 +0000 UTC m=+0.044269154 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:26:08 np0005544118 nova_compute[187283]: 2025-12-03 14:26:08.722 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:08 np0005544118 nova_compute[187283]: 2025-12-03 14:26:08.723 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:08 np0005544118 nova_compute[187283]: 2025-12-03 14:26:08.723 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:26:08 np0005544118 nova_compute[187283]: 2025-12-03 14:26:08.765 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:08 np0005544118 podman[211505]: 2025-12-03 14:26:08.857444183 +0000 UTC m=+0.089809294 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:26:09 np0005544118 nova_compute[187283]: 2025-12-03 14:26:09.028 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:09 np0005544118 nova_compute[187283]: 2025-12-03 14:26:09.384 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:26:11 np0005544118 nova_compute[187283]: 2025-12-03 14:26:11.269 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:11 np0005544118 nova_compute[187283]: 2025-12-03 14:26:11.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:11 np0005544118 nova_compute[187283]: 2025-12-03 14:26:11.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:26:11 np0005544118 nova_compute[187283]: 2025-12-03 14:26:11.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:26:12 np0005544118 nova_compute[187283]: 2025-12-03 14:26:12.070 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:26:12 np0005544118 nova_compute[187283]: 2025-12-03 14:26:12.070 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:12 np0005544118 nova_compute[187283]: 2025-12-03 14:26:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:12 np0005544118 nova_compute[187283]: 2025-12-03 14:26:12.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:13 np0005544118 nova_compute[187283]: 2025-12-03 14:26:13.806 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:14 np0005544118 nova_compute[187283]: 2025-12-03 14:26:14.029 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.084 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.085 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.636 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.636 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.783 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.784 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5899MB free_disk=73.33639907836914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.784 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.784 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.931 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.932 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.966 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.988 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.990 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:26:16 np0005544118 nova_compute[187283]: 2025-12-03 14:26:16.990 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:18 np0005544118 nova_compute[187283]: 2025-12-03 14:26:18.808 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:19 np0005544118 nova_compute[187283]: 2025-12-03 14:26:19.031 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:26:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:26:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:26:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:26:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:26:19 np0005544118 nova_compute[187283]: 2025-12-03 14:26:19.991 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:26:19 np0005544118 nova_compute[187283]: 2025-12-03 14:26:19.992 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:26:20 np0005544118 podman[211532]: 2025-12-03 14:26:20.878033034 +0000 UTC m=+0.096671032 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Dec  3 09:26:23 np0005544118 nova_compute[187283]: 2025-12-03 14:26:23.811 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:23 np0005544118 podman[211555]: 2025-12-03 14:26:23.812299571 +0000 UTC m=+0.046987289 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:26:24 np0005544118 nova_compute[187283]: 2025-12-03 14:26:24.031 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.274 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.275 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.313 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.566 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.567 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.572 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.572 187287 INFO nova.compute.claims [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:26:28 np0005544118 nova_compute[187283]: 2025-12-03 14:26:28.814 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:29 np0005544118 nova_compute[187283]: 2025-12-03 14:26:29.033 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:29 np0005544118 nova_compute[187283]: 2025-12-03 14:26:29.270 187287 DEBUG nova.compute.provider_tree [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:26:29 np0005544118 nova_compute[187283]: 2025-12-03 14:26:29.394 187287 DEBUG nova.scheduler.client.report [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:26:29 np0005544118 nova_compute[187283]: 2025-12-03 14:26:29.959 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:29 np0005544118 nova_compute[187283]: 2025-12-03 14:26:29.960 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:26:30 np0005544118 nova_compute[187283]: 2025-12-03 14:26:30.085 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:26:30 np0005544118 nova_compute[187283]: 2025-12-03 14:26:30.085 187287 DEBUG nova.network.neutron [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:26:30 np0005544118 nova_compute[187283]: 2025-12-03 14:26:30.264 187287 DEBUG nova.policy [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4b237bacff6d49d2be63949942409611', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5652fdb96fc6489e99abdd765e1b1db6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:26:30 np0005544118 nova_compute[187283]: 2025-12-03 14:26:30.345 187287 INFO nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:26:30 np0005544118 nova_compute[187283]: 2025-12-03 14:26:30.470 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.134 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.136 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.136 187287 INFO nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Creating image(s)#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.137 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "/var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.138 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "/var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.139 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "/var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.156 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.219 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.220 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.221 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.237 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.309 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.311 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.461 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk 1073741824" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.462 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.463 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.524 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.525 187287 DEBUG nova.virt.disk.api [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Checking if we can resize image /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.526 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.580 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.581 187287 DEBUG nova.virt.disk.api [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Cannot resize image /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.582 187287 DEBUG nova.objects.instance [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lazy-loading 'migration_context' on Instance uuid 97d9ec2a-1d59-4429-8183-dd3d8114871b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.628 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.628 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Ensure instance console log exists: /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.629 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.629 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:32 np0005544118 nova_compute[187283]: 2025-12-03 14:26:32.629 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:33 np0005544118 nova_compute[187283]: 2025-12-03 14:26:33.817 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:34 np0005544118 nova_compute[187283]: 2025-12-03 14:26:34.034 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:34 np0005544118 podman[211590]: 2025-12-03 14:26:34.808737136 +0000 UTC m=+0.045622341 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  3 09:26:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:35.599 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:26:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:35.600 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:26:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:35.601 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:26:35 np0005544118 podman[197639]: time="2025-12-03T14:26:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:26:35 np0005544118 nova_compute[187283]: 2025-12-03 14:26:35.640 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:26:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:26:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:26:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2591 "" "Go-http-client/1.1"
Dec  3 09:26:36 np0005544118 nova_compute[187283]: 2025-12-03 14:26:36.344 187287 DEBUG nova.network.neutron [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Successfully created port: f1cd5531-93bf-40ce-9596-abc4e2d58e88 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:26:36 np0005544118 podman[211609]: 2025-12-03 14:26:36.855619028 +0000 UTC m=+0.075355188 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:26:37 np0005544118 nova_compute[187283]: 2025-12-03 14:26:37.992 187287 DEBUG nova.network.neutron [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Successfully updated port: f1cd5531-93bf-40ce-9596-abc4e2d58e88 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.027 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.027 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquired lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.027 187287 DEBUG nova.network.neutron [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.125 187287 DEBUG nova.compute.manager [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received event network-changed-f1cd5531-93bf-40ce-9596-abc4e2d58e88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.125 187287 DEBUG nova.compute.manager [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Refreshing instance network info cache due to event network-changed-f1cd5531-93bf-40ce-9596-abc4e2d58e88. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.126 187287 DEBUG oslo_concurrency.lockutils [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.210 187287 DEBUG nova.network.neutron [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:26:38 np0005544118 nova_compute[187283]: 2025-12-03 14:26:38.820 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.037 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.264 187287 DEBUG nova.network.neutron [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Updating instance_info_cache with network_info: [{"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.297 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Releasing lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.297 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Instance network_info: |[{"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.298 187287 DEBUG oslo_concurrency.lockutils [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.298 187287 DEBUG nova.network.neutron [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Refreshing network info cache for port f1cd5531-93bf-40ce-9596-abc4e2d58e88 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.301 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Start _get_guest_xml network_info=[{"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.306 187287 WARNING nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.313 187287 DEBUG nova.virt.libvirt.host [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.314 187287 DEBUG nova.virt.libvirt.host [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.322 187287 DEBUG nova.virt.libvirt.host [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.323 187287 DEBUG nova.virt.libvirt.host [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.324 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.324 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.325 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.325 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.325 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.326 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.326 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.326 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.327 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.327 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.327 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.328 187287 DEBUG nova.virt.hardware [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.333 187287 DEBUG nova.virt.libvirt.vif [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:26:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1346081165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1346081165',id=10,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5652fdb96fc6489e99abdd765e1b1db6',ramdisk_id='',reservation_id='r-r0tl0d0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699',owner_user_name='tempest-TestExecuteHostMaintenanc
eStrategy-2126731699-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:26:30Z,user_data=None,user_id='4b237bacff6d49d2be63949942409611',uuid=97d9ec2a-1d59-4429-8183-dd3d8114871b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.333 187287 DEBUG nova.network.os_vif_util [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converting VIF {"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.334 187287 DEBUG nova.network.os_vif_util [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ec:fb,bridge_name='br-int',has_traffic_filtering=True,id=f1cd5531-93bf-40ce-9596-abc4e2d58e88,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1cd5531-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.335 187287 DEBUG nova.objects.instance [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 97d9ec2a-1d59-4429-8183-dd3d8114871b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.350 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <uuid>97d9ec2a-1d59-4429-8183-dd3d8114871b</uuid>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <name>instance-0000000a</name>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1346081165</nova:name>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:26:39</nova:creationTime>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:user uuid="4b237bacff6d49d2be63949942409611">tempest-TestExecuteHostMaintenanceStrategy-2126731699-project-member</nova:user>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:project uuid="5652fdb96fc6489e99abdd765e1b1db6">tempest-TestExecuteHostMaintenanceStrategy-2126731699</nova:project>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        <nova:port uuid="f1cd5531-93bf-40ce-9596-abc4e2d58e88">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <entry name="serial">97d9ec2a-1d59-4429-8183-dd3d8114871b</entry>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <entry name="uuid">97d9ec2a-1d59-4429-8183-dd3d8114871b</entry>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk.config"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:f4:ec:fb"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <target dev="tapf1cd5531-93"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/console.log" append="off"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:26:39 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:26:39 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:26:39 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:26:39 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.351 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Preparing to wait for external event network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.351 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.352 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.352 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.352 187287 DEBUG nova.virt.libvirt.vif [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:26:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1346081165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1346081165',id=10,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5652fdb96fc6489e99abdd765e1b1db6',ramdisk_id='',reservation_id='r-r0tl0d0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699',owner_user_name='tempest-TestExecuteHost
MaintenanceStrategy-2126731699-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:26:30Z,user_data=None,user_id='4b237bacff6d49d2be63949942409611',uuid=97d9ec2a-1d59-4429-8183-dd3d8114871b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.353 187287 DEBUG nova.network.os_vif_util [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converting VIF {"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.353 187287 DEBUG nova.network.os_vif_util [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ec:fb,bridge_name='br-int',has_traffic_filtering=True,id=f1cd5531-93bf-40ce-9596-abc4e2d58e88,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1cd5531-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.353 187287 DEBUG os_vif [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ec:fb,bridge_name='br-int',has_traffic_filtering=True,id=f1cd5531-93bf-40ce-9596-abc4e2d58e88,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1cd5531-93') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.354 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.354 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.355 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.357 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.358 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1cd5531-93, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.358 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf1cd5531-93, col_values=(('external_ids', {'iface-id': 'f1cd5531-93bf-40ce-9596-abc4e2d58e88', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f4:ec:fb', 'vm-uuid': '97d9ec2a-1d59-4429-8183-dd3d8114871b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.359 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:39 np0005544118 NetworkManager[55710]: <info>  [1764771999.3603] manager: (tapf1cd5531-93): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.362 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.365 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.366 187287 INFO os_vif [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f4:ec:fb,bridge_name='br-int',has_traffic_filtering=True,id=f1cd5531-93bf-40ce-9596-abc4e2d58e88,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1cd5531-93')#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.440 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.440 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.441 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] No VIF found with MAC fa:16:3e:f4:ec:fb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.441 187287 INFO nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Using config drive#033[00m
Dec  3 09:26:39 np0005544118 podman[211637]: 2025-12-03 14:26:39.849433637 +0000 UTC m=+0.079375777 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller)
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.952 187287 INFO nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Creating config drive at /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk.config#033[00m
Dec  3 09:26:39 np0005544118 nova_compute[187283]: 2025-12-03 14:26:39.958 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3jkc_x5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.084 187287 DEBUG oslo_concurrency.processutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw3jkc_x5" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:26:40 np0005544118 kernel: tapf1cd5531-93: entered promiscuous mode
Dec  3 09:26:40 np0005544118 NetworkManager[55710]: <info>  [1764772000.1487] manager: (tapf1cd5531-93): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Dec  3 09:26:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:26:40Z|00088|binding|INFO|Claiming lport f1cd5531-93bf-40ce-9596-abc4e2d58e88 for this chassis.
Dec  3 09:26:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:26:40Z|00089|binding|INFO|f1cd5531-93bf-40ce-9596-abc4e2d58e88: Claiming fa:16:3e:f4:ec:fb 10.100.0.9
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.149 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.151 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.157 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 systemd-udevd[211679]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:26:40 np0005544118 systemd-machined[153602]: New machine qemu-7-instance-0000000a.
Dec  3 09:26:40 np0005544118 NetworkManager[55710]: <info>  [1764772000.1917] device (tapf1cd5531-93): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:26:40 np0005544118 NetworkManager[55710]: <info>  [1764772000.1934] device (tapf1cd5531-93): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.205 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:ec:fb 10.100.0.9'], port_security=['fa:16:3e:f4:ec:fb 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '97d9ec2a-1d59-4429-8183-dd3d8114871b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5652fdb96fc6489e99abdd765e1b1db6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '209361f2-4655-48c3-a37e-97c45df4cb2d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=687fd236-dd3b-44dd-bb4b-246dd0bfb051, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=f1cd5531-93bf-40ce-9596-abc4e2d58e88) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.206 104491 INFO neutron.agent.ovn.metadata.agent [-] Port f1cd5531-93bf-40ce-9596-abc4e2d58e88 in datapath 9299ef1c-9afe-45d5-9eb4-e908925e3805 bound to our chassis#033[00m
Dec  3 09:26:40 np0005544118 systemd[1]: Started Virtual Machine qemu-7-instance-0000000a.
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.208 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9299ef1c-9afe-45d5-9eb4-e908925e3805#033[00m
Dec  3 09:26:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:26:40Z|00090|binding|INFO|Setting lport f1cd5531-93bf-40ce-9596-abc4e2d58e88 ovn-installed in OVS
Dec  3 09:26:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:26:40Z|00091|binding|INFO|Setting lport f1cd5531-93bf-40ce-9596-abc4e2d58e88 up in Southbound
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.213 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.220 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bb850e7e-4359-412b-9c05-3d85dfb962a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.222 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9299ef1c-91 in ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.224 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9299ef1c-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.224 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3df27805-021a-42fe-a255-7deb2bfa71ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.224 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b8afa636-defd-4bab-be5c-7508661e3e18]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.238 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c574c4-1b54-43b3-936b-2ce635e91020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.262 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[66b00a87-b5c5-4047-b8bb-c0c853a30f1e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.296 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb96efc-1afb-46af-9c81-b300a2b45a2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 NetworkManager[55710]: <info>  [1764772000.3059] manager: (tap9299ef1c-90): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.304 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fd4ccc7a-721c-40fe-8b20-b66df67bcb3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.337 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[bef685ad-bd46-4fb3-99f1-002d299ce426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.340 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[100cfe28-6da0-44fb-9298-fef7a4613452]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 NetworkManager[55710]: <info>  [1764772000.3644] device (tap9299ef1c-90): carrier: link connected
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.372 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[31995804-64cf-4ce1-8c79-17740ae0395d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.391 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e90dc751-ab72-4a12-a7ca-64c26638a11d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9299ef1c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:82:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415597, 'reachable_time': 34797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 211713, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.407 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[50961326-ca66-4b29-8b1c-bfd66b1c4bde]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe05:8277'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415597, 'tstamp': 415597}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 211714, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.422 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[814a2e49-5c79-4820-aeef-7577bdc31ed9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9299ef1c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:82:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415597, 'reachable_time': 34797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 211715, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.453 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4d85c244-956f-4e8a-bb06-1ff2094a5668]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.506 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7df64cac-f9c5-42d7-943f-39d72b3cdae8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.507 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9299ef1c-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.508 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.508 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9299ef1c-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.510 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 NetworkManager[55710]: <info>  [1764772000.5112] manager: (tap9299ef1c-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Dec  3 09:26:40 np0005544118 kernel: tap9299ef1c-90: entered promiscuous mode
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.512 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.516 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9299ef1c-90, col_values=(('external_ids', {'iface-id': '3c821d3f-b8a3-4997-aa8d-bf80d29cf36a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.517 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:26:40Z|00092|binding|INFO|Releasing lport 3c821d3f-b8a3-4997-aa8d-bf80d29cf36a from this chassis (sb_readonly=0)
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.518 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.521 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9299ef1c-9afe-45d5-9eb4-e908925e3805.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9299ef1c-9afe-45d5-9eb4-e908925e3805.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.528 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6f972269-7491-468d-a477-19ff654196bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.529 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.532 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-9299ef1c-9afe-45d5-9eb4-e908925e3805
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/9299ef1c-9afe-45d5-9eb4-e908925e3805.pid.haproxy
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 9299ef1c-9afe-45d5-9eb4-e908925e3805
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:26:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:26:40.533 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'env', 'PROCESS_TAG=haproxy-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9299ef1c-9afe-45d5-9eb4-e908925e3805.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.687 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772000.686844, 97d9ec2a-1d59-4429-8183-dd3d8114871b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.687 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] VM Started (Lifecycle Event)#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.708 187287 DEBUG nova.compute.manager [req-6ebf2f09-ceeb-4aa8-a371-f328bbb9d6f9 req-55413e3b-a291-4dec-9b52-be9a9ee074df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received event network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.708 187287 DEBUG oslo_concurrency.lockutils [req-6ebf2f09-ceeb-4aa8-a371-f328bbb9d6f9 req-55413e3b-a291-4dec-9b52-be9a9ee074df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.709 187287 DEBUG oslo_concurrency.lockutils [req-6ebf2f09-ceeb-4aa8-a371-f328bbb9d6f9 req-55413e3b-a291-4dec-9b52-be9a9ee074df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.709 187287 DEBUG oslo_concurrency.lockutils [req-6ebf2f09-ceeb-4aa8-a371-f328bbb9d6f9 req-55413e3b-a291-4dec-9b52-be9a9ee074df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.709 187287 DEBUG nova.compute.manager [req-6ebf2f09-ceeb-4aa8-a371-f328bbb9d6f9 req-55413e3b-a291-4dec-9b52-be9a9ee074df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Processing event network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.710 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.715 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.719 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.720 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.723 187287 INFO nova.virt.libvirt.driver [-] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Instance spawned successfully.#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.724 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.754 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.754 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772000.6869478, 97d9ec2a-1d59-4429-8183-dd3d8114871b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.755 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.759 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.759 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.760 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.760 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.760 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.761 187287 DEBUG nova.virt.libvirt.driver [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.793 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.796 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772000.7190537, 97d9ec2a-1d59-4429-8183-dd3d8114871b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.797 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.827 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.830 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.836 187287 DEBUG nova.network.neutron [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Updated VIF entry in instance network info cache for port f1cd5531-93bf-40ce-9596-abc4e2d58e88. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.837 187287 DEBUG nova.network.neutron [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Updating instance_info_cache with network_info: [{"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.861 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.862 187287 DEBUG oslo_concurrency.lockutils [req-d8c5f50b-baff-4f9d-98ad-1409fe38ed10 req-2a2381e8-079e-42e0-9088-473b0f93696a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.882 187287 INFO nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Took 8.75 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.882 187287 DEBUG nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.954 187287 INFO nova.compute.manager [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Took 12.42 seconds to build instance.#033[00m
Dec  3 09:26:40 np0005544118 nova_compute[187283]: 2025-12-03 14:26:40.981 187287 DEBUG oslo_concurrency.lockutils [None req-106e3b61-0853-4967-8e37-ca7c23101133 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:40 np0005544118 podman[211754]: 2025-12-03 14:26:40.901998752 +0000 UTC m=+0.024581275 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:26:42 np0005544118 podman[211754]: 2025-12-03 14:26:42.140903656 +0000 UTC m=+1.263486159 container create f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  3 09:26:42 np0005544118 systemd[1]: Started libpod-conmon-f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96.scope.
Dec  3 09:26:42 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:26:42 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0acdb76d0f09e6428ade4b6df48835cbe0e2789d316857ff2824e9724e2e4c11/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:26:42 np0005544118 podman[211754]: 2025-12-03 14:26:42.249162675 +0000 UTC m=+1.371745168 container init f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:26:42 np0005544118 podman[211754]: 2025-12-03 14:26:42.256437154 +0000 UTC m=+1.379019647 container start f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:26:42 np0005544118 neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805[211769]: [NOTICE]   (211773) : New worker (211775) forked
Dec  3 09:26:42 np0005544118 neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805[211769]: [NOTICE]   (211773) : Loading success.
Dec  3 09:26:42 np0005544118 nova_compute[187283]: 2025-12-03 14:26:42.863 187287 DEBUG nova.compute.manager [req-e3cb06d2-8f3a-40f1-9421-d594bf28e105 req-82e58275-ed17-464c-992f-4a174282157f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received event network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:26:42 np0005544118 nova_compute[187283]: 2025-12-03 14:26:42.864 187287 DEBUG oslo_concurrency.lockutils [req-e3cb06d2-8f3a-40f1-9421-d594bf28e105 req-82e58275-ed17-464c-992f-4a174282157f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:26:42 np0005544118 nova_compute[187283]: 2025-12-03 14:26:42.864 187287 DEBUG oslo_concurrency.lockutils [req-e3cb06d2-8f3a-40f1-9421-d594bf28e105 req-82e58275-ed17-464c-992f-4a174282157f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:26:42 np0005544118 nova_compute[187283]: 2025-12-03 14:26:42.865 187287 DEBUG oslo_concurrency.lockutils [req-e3cb06d2-8f3a-40f1-9421-d594bf28e105 req-82e58275-ed17-464c-992f-4a174282157f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:26:42 np0005544118 nova_compute[187283]: 2025-12-03 14:26:42.865 187287 DEBUG nova.compute.manager [req-e3cb06d2-8f3a-40f1-9421-d594bf28e105 req-82e58275-ed17-464c-992f-4a174282157f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] No waiting events found dispatching network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:26:42 np0005544118 nova_compute[187283]: 2025-12-03 14:26:42.865 187287 WARNING nova.compute.manager [req-e3cb06d2-8f3a-40f1-9421-d594bf28e105 req-82e58275-ed17-464c-992f-4a174282157f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received unexpected event network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:26:44 np0005544118 nova_compute[187283]: 2025-12-03 14:26:44.038 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:44 np0005544118 nova_compute[187283]: 2025-12-03 14:26:44.360 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:49 np0005544118 nova_compute[187283]: 2025-12-03 14:26:49.040 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:49 np0005544118 nova_compute[187283]: 2025-12-03 14:26:49.362 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:26:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:26:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:26:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:26:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:26:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:26:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:26:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:26:51 np0005544118 podman[211786]: 2025-12-03 14:26:51.828641373 +0000 UTC m=+0.059523663 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, container_name=openstack_network_exporter)
Dec  3 09:26:54 np0005544118 nova_compute[187283]: 2025-12-03 14:26:54.041 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:54 np0005544118 nova_compute[187283]: 2025-12-03 14:26:54.364 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:54 np0005544118 ovn_controller[95637]: 2025-12-03T14:26:54Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f4:ec:fb 10.100.0.9
Dec  3 09:26:54 np0005544118 ovn_controller[95637]: 2025-12-03T14:26:54Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f4:ec:fb 10.100.0.9
Dec  3 09:26:54 np0005544118 podman[211824]: 2025-12-03 14:26:54.843977773 +0000 UTC m=+0.076794777 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  3 09:26:59 np0005544118 nova_compute[187283]: 2025-12-03 14:26:59.044 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:26:59 np0005544118 nova_compute[187283]: 2025-12-03 14:26:59.366 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:00.958 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:00.959 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:00.960 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:04 np0005544118 nova_compute[187283]: 2025-12-03 14:27:04.046 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:04 np0005544118 nova_compute[187283]: 2025-12-03 14:27:04.368 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:05 np0005544118 podman[197639]: time="2025-12-03T14:27:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:27:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:27:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:27:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:27:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Dec  3 09:27:05 np0005544118 podman[211850]: 2025-12-03 14:27:05.825111388 +0000 UTC m=+0.051703858 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:27:07 np0005544118 podman[211869]: 2025-12-03 14:27:07.841509055 +0000 UTC m=+0.075563154 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:27:08 np0005544118 nova_compute[187283]: 2025-12-03 14:27:08.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:09 np0005544118 nova_compute[187283]: 2025-12-03 14:27:09.049 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:09 np0005544118 nova_compute[187283]: 2025-12-03 14:27:09.369 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:10 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:10Z|00093|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  3 09:27:10 np0005544118 nova_compute[187283]: 2025-12-03 14:27:10.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:10 np0005544118 podman[211893]: 2025-12-03 14:27:10.840432293 +0000 UTC m=+0.076981282 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.660 187287 DEBUG nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Creating tmpfile /var/lib/nova/instances/tmpcxktbl9_ to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.661 187287 DEBUG nova.compute.manager [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcxktbl9_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.782 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.783 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.783 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:27:12 np0005544118 nova_compute[187283]: 2025-12-03 14:27:12.783 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 97d9ec2a-1d59-4429-8183-dd3d8114871b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:27:13 np0005544118 nova_compute[187283]: 2025-12-03 14:27:13.602 187287 DEBUG nova.compute.manager [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcxktbl9_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e0d7692-4606-42cf-a3fe-98ccda5e9d06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:27:13 np0005544118 nova_compute[187283]: 2025-12-03 14:27:13.625 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-9e0d7692-4606-42cf-a3fe-98ccda5e9d06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:27:13 np0005544118 nova_compute[187283]: 2025-12-03 14:27:13.626 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-9e0d7692-4606-42cf-a3fe-98ccda5e9d06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:27:13 np0005544118 nova_compute[187283]: 2025-12-03 14:27:13.626 187287 DEBUG nova.network.neutron [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:27:14 np0005544118 nova_compute[187283]: 2025-12-03 14:27:14.051 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:14 np0005544118 nova_compute[187283]: 2025-12-03 14:27:14.372 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:14 np0005544118 nova_compute[187283]: 2025-12-03 14:27:14.812 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Updating instance_info_cache with network_info: [{"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:27:14 np0005544118 nova_compute[187283]: 2025-12-03 14:27:14.839 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-97d9ec2a-1d59-4429-8183-dd3d8114871b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:27:14 np0005544118 nova_compute[187283]: 2025-12-03 14:27:14.839 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:27:14 np0005544118 nova_compute[187283]: 2025-12-03 14:27:14.840 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:14 np0005544118 nova_compute[187283]: 2025-12-03 14:27:14.840 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.843 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.843 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.844 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.844 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:27:16 np0005544118 nova_compute[187283]: 2025-12-03 14:27:16.932 187287 DEBUG nova.network.neutron [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Updating instance_info_cache with network_info: [{"id": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "address": "fa:16:3e:c7:d0:35", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee9f0a5-2f", "ovs_interfaceid": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:27:17 np0005544118 nova_compute[187283]: 2025-12-03 14:27:17.642 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-9e0d7692-4606-42cf-a3fe-98ccda5e9d06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:27:17 np0005544118 nova_compute[187283]: 2025-12-03 14:27:17.644 187287 DEBUG nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcxktbl9_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e0d7692-4606-42cf-a3fe-98ccda5e9d06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:27:17 np0005544118 nova_compute[187283]: 2025-12-03 14:27:17.644 187287 DEBUG nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Creating instance directory: /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:27:17 np0005544118 nova_compute[187283]: 2025-12-03 14:27:17.645 187287 DEBUG nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Creating disk.info with the contents: {'/var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk': 'qcow2', '/var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:27:17 np0005544118 nova_compute[187283]: 2025-12-03 14:27:17.645 187287 DEBUG nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:27:17 np0005544118 nova_compute[187283]: 2025-12-03 14:27:17.646 187287 DEBUG nova.objects.instance [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.062 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.079 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.122 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.123 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.124 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.134 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.149 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.150 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.187 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.189 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.206 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.337 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.338 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5726MB free_disk=73.30733489990234GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.338 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.339 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.390 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Migration for instance 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.417 187287 INFO nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Updating resource usage from migration 5702d01b-743e-4fd4-8d0b-70fadcb62e8e#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.417 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Starting to track incoming migration 5702d01b-743e-4fd4-8d0b-70fadcb62e8e with flavor ec610f84-c649-49d7-9c7a-a22befc31fb8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.455 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 97d9ec2a-1d59-4429-8183-dd3d8114871b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.475 187287 WARNING nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.476 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.476 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.532 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.551 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.575 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:27:18 np0005544118 nova_compute[187283]: 2025-12-03 14:27:18.576 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.053 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.331 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk 1073741824" returned: 0 in 1.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.332 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 1.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.333 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.375 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.392 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.393 187287 DEBUG nova.virt.disk.api [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.394 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:27:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:27:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:27:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:27:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.455 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.457 187287 DEBUG nova.virt.disk.api [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.457 187287 DEBUG nova.objects.instance [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.473 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.497 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk.config 485376" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.499 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk.config to /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.499 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk.config /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.979 187287 DEBUG oslo_concurrency.processutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06/disk.config /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.981 187287 DEBUG nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.982 187287 DEBUG nova.virt.libvirt.vif [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-906207681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-906207681',id=9,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:26:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5652fdb96fc6489e99abdd765e1b1db6',ramdisk_id='',reservation_id='r-atxaldah',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:26:20Z,user_data=None,user_id='4b237bacff6d49d2be63949942409611',uuid=9e0d7692-4606-42cf-a3fe-98ccda5e9d06,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "address": "fa:16:3e:c7:d0:35", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeee9f0a5-2f", "ovs_interfaceid": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.983 187287 DEBUG nova.network.os_vif_util [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "address": "fa:16:3e:c7:d0:35", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeee9f0a5-2f", "ovs_interfaceid": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.984 187287 DEBUG nova.network.os_vif_util [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:d0:35,bridge_name='br-int',has_traffic_filtering=True,id=eee9f0a5-2fd4-4de0-8119-d7745404f4d2,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee9f0a5-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.984 187287 DEBUG os_vif [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:d0:35,bridge_name='br-int',has_traffic_filtering=True,id=eee9f0a5-2fd4-4de0-8119-d7745404f4d2,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee9f0a5-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.985 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.986 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.987 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.991 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.991 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeee9f0a5-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:19 np0005544118 nova_compute[187283]: 2025-12-03 14:27:19.992 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeee9f0a5-2f, col_values=(('external_ids', {'iface-id': 'eee9f0a5-2fd4-4de0-8119-d7745404f4d2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:d0:35', 'vm-uuid': '9e0d7692-4606-42cf-a3fe-98ccda5e9d06'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:20 np0005544118 nova_compute[187283]: 2025-12-03 14:27:20.041 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:20 np0005544118 NetworkManager[55710]: <info>  [1764772040.0422] manager: (tapeee9f0a5-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec  3 09:27:20 np0005544118 nova_compute[187283]: 2025-12-03 14:27:20.044 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:27:20 np0005544118 nova_compute[187283]: 2025-12-03 14:27:20.049 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:20 np0005544118 nova_compute[187283]: 2025-12-03 14:27:20.050 187287 INFO os_vif [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:d0:35,bridge_name='br-int',has_traffic_filtering=True,id=eee9f0a5-2fd4-4de0-8119-d7745404f4d2,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee9f0a5-2f')#033[00m
Dec  3 09:27:20 np0005544118 nova_compute[187283]: 2025-12-03 14:27:20.050 187287 DEBUG nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:27:20 np0005544118 nova_compute[187283]: 2025-12-03 14:27:20.050 187287 DEBUG nova.compute.manager [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcxktbl9_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e0d7692-4606-42cf-a3fe-98ccda5e9d06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.576 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.625 187287 DEBUG nova.network.neutron [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Port eee9f0a5-2fd4-4de0-8119-d7745404f4d2 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.627 187287 DEBUG nova.compute.manager [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpcxktbl9_',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='9e0d7692-4606-42cf-a3fe-98ccda5e9d06',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:27:21 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:27:21 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:27:21 np0005544118 kernel: tapeee9f0a5-2f: entered promiscuous mode
Dec  3 09:27:21 np0005544118 NetworkManager[55710]: <info>  [1764772041.9133] manager: (tapeee9f0a5-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Dec  3 09:27:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:21Z|00094|binding|INFO|Claiming lport eee9f0a5-2fd4-4de0-8119-d7745404f4d2 for this additional chassis.
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.914 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:21Z|00095|binding|INFO|eee9f0a5-2fd4-4de0-8119-d7745404f4d2: Claiming fa:16:3e:c7:d0:35 10.100.0.6
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.928 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:21Z|00096|binding|INFO|Setting lport eee9f0a5-2fd4-4de0-8119-d7745404f4d2 ovn-installed in OVS
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.931 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:21 np0005544118 nova_compute[187283]: 2025-12-03 14:27:21.935 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:21 np0005544118 systemd-udevd[211994]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:27:21 np0005544118 systemd-machined[153602]: New machine qemu-8-instance-00000009.
Dec  3 09:27:21 np0005544118 NetworkManager[55710]: <info>  [1764772041.9615] device (tapeee9f0a5-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:27:21 np0005544118 NetworkManager[55710]: <info>  [1764772041.9625] device (tapeee9f0a5-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:27:21 np0005544118 systemd[1]: Started Virtual Machine qemu-8-instance-00000009.
Dec  3 09:27:21 np0005544118 podman[211974]: 2025-12-03 14:27:21.979377797 +0000 UTC m=+0.063946425 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, name=ubi9-minimal)
Dec  3 09:27:23 np0005544118 nova_compute[187283]: 2025-12-03 14:27:23.903 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772043.9030895, 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:27:23 np0005544118 nova_compute[187283]: 2025-12-03 14:27:23.904 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] VM Started (Lifecycle Event)#033[00m
Dec  3 09:27:24 np0005544118 nova_compute[187283]: 2025-12-03 14:27:24.054 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:24 np0005544118 nova_compute[187283]: 2025-12-03 14:27:24.115 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:27:24 np0005544118 nova_compute[187283]: 2025-12-03 14:27:24.734 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772044.7343266, 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:27:24 np0005544118 nova_compute[187283]: 2025-12-03 14:27:24.735 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:27:25 np0005544118 nova_compute[187283]: 2025-12-03 14:27:25.080 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:25 np0005544118 nova_compute[187283]: 2025-12-03 14:27:25.379 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:27:25 np0005544118 nova_compute[187283]: 2025-12-03 14:27:25.385 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:27:25 np0005544118 nova_compute[187283]: 2025-12-03 14:27:25.406 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:27:25 np0005544118 podman[212032]: 2025-12-03 14:27:25.865524337 +0000 UTC m=+0.087766958 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3)
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.197 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.198 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:27:26 np0005544118 nova_compute[187283]: 2025-12-03 14:27:26.253 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:26Z|00097|binding|INFO|Claiming lport eee9f0a5-2fd4-4de0-8119-d7745404f4d2 for this chassis.
Dec  3 09:27:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:26Z|00098|binding|INFO|eee9f0a5-2fd4-4de0-8119-d7745404f4d2: Claiming fa:16:3e:c7:d0:35 10.100.0.6
Dec  3 09:27:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:26Z|00099|binding|INFO|Setting lport eee9f0a5-2fd4-4de0-8119-d7745404f4d2 up in Southbound
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.262 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:d0:35 10.100.0.6'], port_security=['fa:16:3e:c7:d0:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9e0d7692-4606-42cf-a3fe-98ccda5e9d06', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5652fdb96fc6489e99abdd765e1b1db6', 'neutron:revision_number': '11', 'neutron:security_group_ids': '209361f2-4655-48c3-a37e-97c45df4cb2d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=687fd236-dd3b-44dd-bb4b-246dd0bfb051, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=eee9f0a5-2fd4-4de0-8119-d7745404f4d2) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.263 104491 INFO neutron.agent.ovn.metadata.agent [-] Port eee9f0a5-2fd4-4de0-8119-d7745404f4d2 in datapath 9299ef1c-9afe-45d5-9eb4-e908925e3805 bound to our chassis#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.265 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9299ef1c-9afe-45d5-9eb4-e908925e3805#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.281 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9bf17026-1076-4f79-85cf-2cff2efab381]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.311 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[77258722-d418-42f7-bf45-a595929c5630]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.314 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[4720a7ba-9458-44b5-b253-dd7bc485a6b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.337 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[796bfcc6-8545-4b25-bdee-80ce1987dec3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.354 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[66513837-5545-4b00-9c5c-2944407f2ab8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9299ef1c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:82:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415597, 'reachable_time': 36894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212057, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.367 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[219b1cb9-b078-4dde-bf53-215bc3b2fa5a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9299ef1c-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415608, 'tstamp': 415608}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212058, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9299ef1c-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415610, 'tstamp': 415610}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212058, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.369 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9299ef1c-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:26 np0005544118 nova_compute[187283]: 2025-12-03 14:27:26.371 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:26 np0005544118 nova_compute[187283]: 2025-12-03 14:27:26.372 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.373 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9299ef1c-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.373 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.373 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9299ef1c-90, col_values=(('external_ids', {'iface-id': '3c821d3f-b8a3-4997-aa8d-bf80d29cf36a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:26.374 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:27:26 np0005544118 nova_compute[187283]: 2025-12-03 14:27:26.418 187287 INFO nova.compute.manager [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Post operation of migration started#033[00m
Dec  3 09:27:26 np0005544118 nova_compute[187283]: 2025-12-03 14:27:26.674 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-9e0d7692-4606-42cf-a3fe-98ccda5e9d06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:27:26 np0005544118 nova_compute[187283]: 2025-12-03 14:27:26.675 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-9e0d7692-4606-42cf-a3fe-98ccda5e9d06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:27:26 np0005544118 nova_compute[187283]: 2025-12-03 14:27:26.675 187287 DEBUG nova.network.neutron [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:27:28 np0005544118 nova_compute[187283]: 2025-12-03 14:27:28.208 187287 DEBUG nova.network.neutron [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Updating instance_info_cache with network_info: [{"id": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "address": "fa:16:3e:c7:d0:35", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee9f0a5-2f", "ovs_interfaceid": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:27:28 np0005544118 nova_compute[187283]: 2025-12-03 14:27:28.230 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-9e0d7692-4606-42cf-a3fe-98ccda5e9d06" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:27:28 np0005544118 nova_compute[187283]: 2025-12-03 14:27:28.245 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:28 np0005544118 nova_compute[187283]: 2025-12-03 14:27:28.245 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:28 np0005544118 nova_compute[187283]: 2025-12-03 14:27:28.246 187287 DEBUG oslo_concurrency.lockutils [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:28 np0005544118 nova_compute[187283]: 2025-12-03 14:27:28.250 187287 INFO nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:27:28 np0005544118 virtqemud[186958]: Domain id=8 name='instance-00000009' uuid=9e0d7692-4606-42cf-a3fe-98ccda5e9d06 is tainted: custom-monitor
Dec  3 09:27:29 np0005544118 nova_compute[187283]: 2025-12-03 14:27:29.056 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:29 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:29.201 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:29 np0005544118 nova_compute[187283]: 2025-12-03 14:27:29.257 187287 INFO nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:27:30 np0005544118 nova_compute[187283]: 2025-12-03 14:27:30.123 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:30 np0005544118 nova_compute[187283]: 2025-12-03 14:27:30.262 187287 INFO nova.virt.libvirt.driver [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:27:30 np0005544118 nova_compute[187283]: 2025-12-03 14:27:30.266 187287 DEBUG nova.compute.manager [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:27:30 np0005544118 nova_compute[187283]: 2025-12-03 14:27:30.368 187287 DEBUG nova.objects.instance [None req-4eca155c-3b7e-45b8-9738-d4e9147a89fd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:27:34 np0005544118 nova_compute[187283]: 2025-12-03 14:27:34.059 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:35 np0005544118 nova_compute[187283]: 2025-12-03 14:27:35.125 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:35 np0005544118 podman[197639]: time="2025-12-03T14:27:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:27:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:27:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:27:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:27:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.368 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.370 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.370 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.371 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.371 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.373 187287 INFO nova.compute.manager [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Terminating instance#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.374 187287 DEBUG nova.compute.manager [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:27:36 np0005544118 kernel: tapf1cd5531-93 (unregistering): left promiscuous mode
Dec  3 09:27:36 np0005544118 NetworkManager[55710]: <info>  [1764772056.4183] device (tapf1cd5531-93): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:27:36 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:36Z|00100|binding|INFO|Releasing lport f1cd5531-93bf-40ce-9596-abc4e2d58e88 from this chassis (sb_readonly=0)
Dec  3 09:27:36 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:36Z|00101|binding|INFO|Setting lport f1cd5531-93bf-40ce-9596-abc4e2d58e88 down in Southbound
Dec  3 09:27:36 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:36Z|00102|binding|INFO|Removing iface tapf1cd5531-93 ovn-installed in OVS
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.424 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.426 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.436 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec  3 09:27:36 np0005544118 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000a.scope: Consumed 14.637s CPU time.
Dec  3 09:27:36 np0005544118 systemd-machined[153602]: Machine qemu-7-instance-0000000a terminated.
Dec  3 09:27:36 np0005544118 podman[212061]: 2025-12-03 14:27:36.503003049 +0000 UTC m=+0.053307603 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.507 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f4:ec:fb 10.100.0.9'], port_security=['fa:16:3e:f4:ec:fb 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '97d9ec2a-1d59-4429-8183-dd3d8114871b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5652fdb96fc6489e99abdd765e1b1db6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '209361f2-4655-48c3-a37e-97c45df4cb2d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=687fd236-dd3b-44dd-bb4b-246dd0bfb051, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=f1cd5531-93bf-40ce-9596-abc4e2d58e88) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.508 104491 INFO neutron.agent.ovn.metadata.agent [-] Port f1cd5531-93bf-40ce-9596-abc4e2d58e88 in datapath 9299ef1c-9afe-45d5-9eb4-e908925e3805 unbound from our chassis#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.510 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9299ef1c-9afe-45d5-9eb4-e908925e3805#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.527 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6dde031a-9e6e-4df5-b801-78acbc626d1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.556 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[68836cb6-ee6d-466f-94c8-38d2b2ab17c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.559 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[98ee9bb9-f061-4dc7-a054-007e6bfbc37f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.588 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[75391d22-6e3d-4618-b70b-f959b708d9d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.595 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.599 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.609 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d73e4974-c185-4221-973e-356ee074ba4c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9299ef1c-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:05:82:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415597, 'reachable_time': 36894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212094, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.626 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[08e1daca-382f-4a18-b3c6-4150f295fe16]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap9299ef1c-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415608, 'tstamp': 415608}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212104, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap9299ef1c-91'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415610, 'tstamp': 415610}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212104, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.628 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9299ef1c-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.630 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.635 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.636 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9299ef1c-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.636 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.637 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9299ef1c-90, col_values=(('external_ids', {'iface-id': '3c821d3f-b8a3-4997-aa8d-bf80d29cf36a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.637 187287 INFO nova.virt.libvirt.driver [-] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Instance destroyed successfully.#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.638 187287 DEBUG nova.objects.instance [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lazy-loading 'resources' on Instance uuid 97d9ec2a-1d59-4429-8183-dd3d8114871b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:27:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:36.637 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.719 187287 DEBUG nova.virt.libvirt.vif [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:26:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1346081165',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1346081165',id=10,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:26:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5652fdb96fc6489e99abdd765e1b1db6',ramdisk_id='',reservation_id='r-r0tl0d0y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_
min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:26:40Z,user_data=None,user_id='4b237bacff6d49d2be63949942409611',uuid=97d9ec2a-1d59-4429-8183-dd3d8114871b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.719 187287 DEBUG nova.network.os_vif_util [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converting VIF {"id": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "address": "fa:16:3e:f4:ec:fb", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf1cd5531-93", "ovs_interfaceid": "f1cd5531-93bf-40ce-9596-abc4e2d58e88", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.720 187287 DEBUG nova.network.os_vif_util [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f4:ec:fb,bridge_name='br-int',has_traffic_filtering=True,id=f1cd5531-93bf-40ce-9596-abc4e2d58e88,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1cd5531-93') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.720 187287 DEBUG os_vif [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:ec:fb,bridge_name='br-int',has_traffic_filtering=True,id=f1cd5531-93bf-40ce-9596-abc4e2d58e88,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1cd5531-93') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.721 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.722 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1cd5531-93, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.723 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.725 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.727 187287 INFO os_vif [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f4:ec:fb,bridge_name='br-int',has_traffic_filtering=True,id=f1cd5531-93bf-40ce-9596-abc4e2d58e88,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf1cd5531-93')#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.727 187287 INFO nova.virt.libvirt.driver [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Deleting instance files /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b_del#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.728 187287 INFO nova.virt.libvirt.driver [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Deletion of /var/lib/nova/instances/97d9ec2a-1d59-4429-8183-dd3d8114871b_del complete#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.821 187287 INFO nova.compute.manager [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.821 187287 DEBUG oslo.service.loopingcall [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.822 187287 DEBUG nova.compute.manager [-] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:27:36 np0005544118 nova_compute[187283]: 2025-12-03 14:27:36.822 187287 DEBUG nova.network.neutron [-] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.395 187287 DEBUG nova.compute.manager [req-1f5e584d-9f91-4498-92d0-a782ca56be21 req-6ac86dd4-aabc-4847-a264-7317c2ae74f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received event network-vif-unplugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.395 187287 DEBUG oslo_concurrency.lockutils [req-1f5e584d-9f91-4498-92d0-a782ca56be21 req-6ac86dd4-aabc-4847-a264-7317c2ae74f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.395 187287 DEBUG oslo_concurrency.lockutils [req-1f5e584d-9f91-4498-92d0-a782ca56be21 req-6ac86dd4-aabc-4847-a264-7317c2ae74f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.396 187287 DEBUG oslo_concurrency.lockutils [req-1f5e584d-9f91-4498-92d0-a782ca56be21 req-6ac86dd4-aabc-4847-a264-7317c2ae74f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.396 187287 DEBUG nova.compute.manager [req-1f5e584d-9f91-4498-92d0-a782ca56be21 req-6ac86dd4-aabc-4847-a264-7317c2ae74f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] No waiting events found dispatching network-vif-unplugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.396 187287 DEBUG nova.compute.manager [req-1f5e584d-9f91-4498-92d0-a782ca56be21 req-6ac86dd4-aabc-4847-a264-7317c2ae74f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received event network-vif-unplugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.890 187287 DEBUG nova.network.neutron [-] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.951 187287 DEBUG nova.compute.manager [req-60862ba1-e388-4659-a172-817283494a68 req-1f276982-6e3c-41d1-9472-773434ce5ebf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received event network-vif-deleted-f1cd5531-93bf-40ce-9596-abc4e2d58e88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.952 187287 INFO nova.compute.manager [req-60862ba1-e388-4659-a172-817283494a68 req-1f276982-6e3c-41d1-9472-773434ce5ebf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Neutron deleted interface f1cd5531-93bf-40ce-9596-abc4e2d58e88; detaching it from the instance and deleting it from the info cache#033[00m
Dec  3 09:27:37 np0005544118 nova_compute[187283]: 2025-12-03 14:27:37.952 187287 DEBUG nova.network.neutron [req-60862ba1-e388-4659-a172-817283494a68 req-1f276982-6e3c-41d1-9472-773434ce5ebf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.077 187287 INFO nova.compute.manager [-] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Took 1.25 seconds to deallocate network for instance.#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.082 187287 DEBUG nova.compute.manager [req-60862ba1-e388-4659-a172-817283494a68 req-1f276982-6e3c-41d1-9472-773434ce5ebf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Detach interface failed, port_id=f1cd5531-93bf-40ce-9596-abc4e2d58e88, reason: Instance 97d9ec2a-1d59-4429-8183-dd3d8114871b could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.134 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.135 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.226 187287 DEBUG nova.compute.provider_tree [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.284 187287 DEBUG nova.scheduler.client.report [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.333 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.405 187287 INFO nova.scheduler.client.report [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Deleted allocations for instance 97d9ec2a-1d59-4429-8183-dd3d8114871b#033[00m
Dec  3 09:27:38 np0005544118 nova_compute[187283]: 2025-12-03 14:27:38.587 187287 DEBUG oslo_concurrency.lockutils [None req-b7cdbe6f-d6ae-431a-8c18-0a37fa1524f6 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.218s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:38 np0005544118 podman[212110]: 2025-12-03 14:27:38.819381031 +0000 UTC m=+0.047218615 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.093 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.501 187287 DEBUG nova.compute.manager [req-7f8db618-36d9-430c-9aa1-19df16062741 req-a770ed5e-b048-43bb-aa44-ab0087e422c6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received event network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.501 187287 DEBUG oslo_concurrency.lockutils [req-7f8db618-36d9-430c-9aa1-19df16062741 req-a770ed5e-b048-43bb-aa44-ab0087e422c6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.502 187287 DEBUG oslo_concurrency.lockutils [req-7f8db618-36d9-430c-9aa1-19df16062741 req-a770ed5e-b048-43bb-aa44-ab0087e422c6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.502 187287 DEBUG oslo_concurrency.lockutils [req-7f8db618-36d9-430c-9aa1-19df16062741 req-a770ed5e-b048-43bb-aa44-ab0087e422c6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "97d9ec2a-1d59-4429-8183-dd3d8114871b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.502 187287 DEBUG nova.compute.manager [req-7f8db618-36d9-430c-9aa1-19df16062741 req-a770ed5e-b048-43bb-aa44-ab0087e422c6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] No waiting events found dispatching network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.502 187287 WARNING nova.compute.manager [req-7f8db618-36d9-430c-9aa1-19df16062741 req-a770ed5e-b048-43bb-aa44-ab0087e422c6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Received unexpected event network-vif-plugged-f1cd5531-93bf-40ce-9596-abc4e2d58e88 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.744 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.745 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.745 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.745 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.745 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.746 187287 INFO nova.compute.manager [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Terminating instance#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.747 187287 DEBUG nova.compute.manager [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:27:39 np0005544118 kernel: tapeee9f0a5-2f (unregistering): left promiscuous mode
Dec  3 09:27:39 np0005544118 NetworkManager[55710]: <info>  [1764772059.7712] device (tapeee9f0a5-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:27:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:39Z|00103|binding|INFO|Releasing lport eee9f0a5-2fd4-4de0-8119-d7745404f4d2 from this chassis (sb_readonly=0)
Dec  3 09:27:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:39Z|00104|binding|INFO|Setting lport eee9f0a5-2fd4-4de0-8119-d7745404f4d2 down in Southbound
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.775 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:27:39Z|00105|binding|INFO|Removing iface tapeee9f0a5-2f ovn-installed in OVS
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.792 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:39 np0005544118 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Deactivated successfully.
Dec  3 09:27:39 np0005544118 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000009.scope: Consumed 3.094s CPU time.
Dec  3 09:27:39 np0005544118 systemd-machined[153602]: Machine qemu-8-instance-00000009 terminated.
Dec  3 09:27:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:39.893 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:d0:35 10.100.0.6'], port_security=['fa:16:3e:c7:d0:35 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '9e0d7692-4606-42cf-a3fe-98ccda5e9d06', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5652fdb96fc6489e99abdd765e1b1db6', 'neutron:revision_number': '13', 'neutron:security_group_ids': '209361f2-4655-48c3-a37e-97c45df4cb2d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=687fd236-dd3b-44dd-bb4b-246dd0bfb051, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=eee9f0a5-2fd4-4de0-8119-d7745404f4d2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:27:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:39.894 104491 INFO neutron.agent.ovn.metadata.agent [-] Port eee9f0a5-2fd4-4de0-8119-d7745404f4d2 in datapath 9299ef1c-9afe-45d5-9eb4-e908925e3805 unbound from our chassis#033[00m
Dec  3 09:27:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:39.895 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9299ef1c-9afe-45d5-9eb4-e908925e3805, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:27:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:39.896 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[54e8b2f1-2707-44d4-bc7c-01c7cbf8552d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:39.897 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805 namespace which is not needed anymore#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.968 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:39 np0005544118 nova_compute[187283]: 2025-12-03 14:27:39.971 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.004 187287 INFO nova.virt.libvirt.driver [-] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Instance destroyed successfully.#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.005 187287 DEBUG nova.objects.instance [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lazy-loading 'resources' on Instance uuid 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:27:40 np0005544118 neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805[211769]: [NOTICE]   (211773) : haproxy version is 2.8.14-c23fe91
Dec  3 09:27:40 np0005544118 neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805[211769]: [NOTICE]   (211773) : path to executable is /usr/sbin/haproxy
Dec  3 09:27:40 np0005544118 neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805[211769]: [WARNING]  (211773) : Exiting Master process...
Dec  3 09:27:40 np0005544118 neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805[211769]: [ALERT]    (211773) : Current worker (211775) exited with code 143 (Terminated)
Dec  3 09:27:40 np0005544118 neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805[211769]: [WARNING]  (211773) : All workers exited. Exiting... (0)
Dec  3 09:27:40 np0005544118 systemd[1]: libpod-f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96.scope: Deactivated successfully.
Dec  3 09:27:40 np0005544118 podman[212163]: 2025-12-03 14:27:40.034026491 +0000 UTC m=+0.049577021 container died f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  3 09:27:40 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96-userdata-shm.mount: Deactivated successfully.
Dec  3 09:27:40 np0005544118 systemd[1]: var-lib-containers-storage-overlay-0acdb76d0f09e6428ade4b6df48835cbe0e2789d316857ff2824e9724e2e4c11-merged.mount: Deactivated successfully.
Dec  3 09:27:40 np0005544118 podman[212163]: 2025-12-03 14:27:40.078401558 +0000 UTC m=+0.093952068 container cleanup f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec  3 09:27:40 np0005544118 systemd[1]: libpod-conmon-f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96.scope: Deactivated successfully.
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.122 187287 DEBUG nova.virt.libvirt.vif [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:26:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-906207681',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-906207681',id=9,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:26:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5652fdb96fc6489e99abdd765e1b1db6',ramdisk_id='',reservation_id='r-atxaldah',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-2126731699-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:27:30Z,user_data=None,user_id='4b237bacff6d49d2be63949942409611',uuid=9e0d7692-4606-42cf-a3fe-98ccda5e9d06,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "address": "fa:16:3e:c7:d0:35", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee9f0a5-2f", "ovs_interfaceid": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.123 187287 DEBUG nova.network.os_vif_util [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converting VIF {"id": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "address": "fa:16:3e:c7:d0:35", "network": {"id": "9299ef1c-9afe-45d5-9eb4-e908925e3805", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-353660397-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5652fdb96fc6489e99abdd765e1b1db6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeee9f0a5-2f", "ovs_interfaceid": "eee9f0a5-2fd4-4de0-8119-d7745404f4d2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.124 187287 DEBUG nova.network.os_vif_util [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c7:d0:35,bridge_name='br-int',has_traffic_filtering=True,id=eee9f0a5-2fd4-4de0-8119-d7745404f4d2,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee9f0a5-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.125 187287 DEBUG os_vif [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:d0:35,bridge_name='br-int',has_traffic_filtering=True,id=eee9f0a5-2fd4-4de0-8119-d7745404f4d2,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee9f0a5-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.127 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.128 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeee9f0a5-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:40 np0005544118 podman[212205]: 2025-12-03 14:27:40.139592005 +0000 UTC m=+0.042561108 container remove f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.183 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.185 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.185 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[094c0125-9d9d-4941-8581-cc1d536dc5ef]: (4, ('Wed Dec  3 02:27:39 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805 (f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96)\nf58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96\nWed Dec  3 02:27:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805 (f58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96)\nf58539f60b3a63ca8ef12bdbdcab7948ca0dc6f7a3bbd6c588f5d238bbea8b96\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.187 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b4fa94fe-8bb1-4774-a2d8-6cd237a2665f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.188 187287 INFO os_vif [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:d0:35,bridge_name='br-int',has_traffic_filtering=True,id=eee9f0a5-2fd4-4de0-8119-d7745404f4d2,network=Network(9299ef1c-9afe-45d5-9eb4-e908925e3805),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeee9f0a5-2f')#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.188 187287 INFO nova.virt.libvirt.driver [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Deleting instance files /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06_del#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.189 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9299ef1c-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.189 187287 INFO nova.virt.libvirt.driver [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Deletion of /var/lib/nova/instances/9e0d7692-4606-42cf-a3fe-98ccda5e9d06_del complete#033[00m
Dec  3 09:27:40 np0005544118 kernel: tap9299ef1c-90: left promiscuous mode
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.191 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.192 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.195 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a06ff1ab-d653-4a31-9413-23d09359bc18]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.202 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.211 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9b3a758f-feb9-4c45-9ea0-bc897657b50a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.212 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[66d113c4-938f-475d-84df-9bd4ca3fa5e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.226 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cbe3206e-da27-4f6e-9c46-5a00ad99eec3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415590, 'reachable_time': 16764, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212220, 'error': None, 'target': 'ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:40 np0005544118 systemd[1]: run-netns-ovnmeta\x2d9299ef1c\x2d9afe\x2d45d5\x2d9eb4\x2de908925e3805.mount: Deactivated successfully.
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.230 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9299ef1c-9afe-45d5-9eb4-e908925e3805 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:27:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:27:40.231 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c0115c-9248-41c9-b53b-db89140fbcee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.365 187287 INFO nova.compute.manager [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Took 0.62 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.365 187287 DEBUG oslo.service.loopingcall [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.366 187287 DEBUG nova.compute.manager [-] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:27:40 np0005544118 nova_compute[187283]: 2025-12-03 14:27:40.366 187287 DEBUG nova.network.neutron [-] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:27:41 np0005544118 podman[212221]: 2025-12-03 14:27:41.856321423 +0000 UTC m=+0.088618730 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:27:41 np0005544118 nova_compute[187283]: 2025-12-03 14:27:41.935 187287 DEBUG nova.compute.manager [req-7382bf80-23ac-451a-949c-59b0fbdb4c94 req-63708802-a018-447a-966f-87f60c6d5505 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Received event network-vif-unplugged-eee9f0a5-2fd4-4de0-8119-d7745404f4d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:27:41 np0005544118 nova_compute[187283]: 2025-12-03 14:27:41.936 187287 DEBUG oslo_concurrency.lockutils [req-7382bf80-23ac-451a-949c-59b0fbdb4c94 req-63708802-a018-447a-966f-87f60c6d5505 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:41 np0005544118 nova_compute[187283]: 2025-12-03 14:27:41.937 187287 DEBUG oslo_concurrency.lockutils [req-7382bf80-23ac-451a-949c-59b0fbdb4c94 req-63708802-a018-447a-966f-87f60c6d5505 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:41 np0005544118 nova_compute[187283]: 2025-12-03 14:27:41.938 187287 DEBUG oslo_concurrency.lockutils [req-7382bf80-23ac-451a-949c-59b0fbdb4c94 req-63708802-a018-447a-966f-87f60c6d5505 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:41 np0005544118 nova_compute[187283]: 2025-12-03 14:27:41.938 187287 DEBUG nova.compute.manager [req-7382bf80-23ac-451a-949c-59b0fbdb4c94 req-63708802-a018-447a-966f-87f60c6d5505 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] No waiting events found dispatching network-vif-unplugged-eee9f0a5-2fd4-4de0-8119-d7745404f4d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:27:41 np0005544118 nova_compute[187283]: 2025-12-03 14:27:41.938 187287 DEBUG nova.compute.manager [req-7382bf80-23ac-451a-949c-59b0fbdb4c94 req-63708802-a018-447a-966f-87f60c6d5505 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Received event network-vif-unplugged-eee9f0a5-2fd4-4de0-8119-d7745404f4d2 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:27:42 np0005544118 nova_compute[187283]: 2025-12-03 14:27:42.514 187287 DEBUG nova.network.neutron [-] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:27:42 np0005544118 nova_compute[187283]: 2025-12-03 14:27:42.578 187287 INFO nova.compute.manager [-] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Took 2.21 seconds to deallocate network for instance.#033[00m
Dec  3 09:27:42 np0005544118 nova_compute[187283]: 2025-12-03 14:27:42.695 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:42 np0005544118 nova_compute[187283]: 2025-12-03 14:27:42.697 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:42 np0005544118 nova_compute[187283]: 2025-12-03 14:27:42.706 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:42 np0005544118 nova_compute[187283]: 2025-12-03 14:27:42.771 187287 INFO nova.scheduler.client.report [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Deleted allocations for instance 9e0d7692-4606-42cf-a3fe-98ccda5e9d06#033[00m
Dec  3 09:27:43 np0005544118 nova_compute[187283]: 2025-12-03 14:27:43.013 187287 DEBUG oslo_concurrency.lockutils [None req-9cd0a89b-cd07-49b1-92f1-70da5ccfc91f 4b237bacff6d49d2be63949942409611 5652fdb96fc6489e99abdd765e1b1db6 - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.096 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.291 187287 DEBUG nova.compute.manager [req-da7bcdbe-8566-4fde-b9ea-949372e595e2 req-5f7f7f72-f3ab-4cd8-a147-032e4d6e1f70 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Received event network-vif-deleted-eee9f0a5-2fd4-4de0-8119-d7745404f4d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.292 187287 DEBUG nova.compute.manager [req-da7bcdbe-8566-4fde-b9ea-949372e595e2 req-5f7f7f72-f3ab-4cd8-a147-032e4d6e1f70 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Received event network-vif-plugged-eee9f0a5-2fd4-4de0-8119-d7745404f4d2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.292 187287 DEBUG oslo_concurrency.lockutils [req-da7bcdbe-8566-4fde-b9ea-949372e595e2 req-5f7f7f72-f3ab-4cd8-a147-032e4d6e1f70 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.292 187287 DEBUG oslo_concurrency.lockutils [req-da7bcdbe-8566-4fde-b9ea-949372e595e2 req-5f7f7f72-f3ab-4cd8-a147-032e4d6e1f70 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.293 187287 DEBUG oslo_concurrency.lockutils [req-da7bcdbe-8566-4fde-b9ea-949372e595e2 req-5f7f7f72-f3ab-4cd8-a147-032e4d6e1f70 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "9e0d7692-4606-42cf-a3fe-98ccda5e9d06-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.293 187287 DEBUG nova.compute.manager [req-da7bcdbe-8566-4fde-b9ea-949372e595e2 req-5f7f7f72-f3ab-4cd8-a147-032e4d6e1f70 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] No waiting events found dispatching network-vif-plugged-eee9f0a5-2fd4-4de0-8119-d7745404f4d2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:27:44 np0005544118 nova_compute[187283]: 2025-12-03 14:27:44.293 187287 WARNING nova.compute.manager [req-da7bcdbe-8566-4fde-b9ea-949372e595e2 req-5f7f7f72-f3ab-4cd8-a147-032e4d6e1f70 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Received unexpected event network-vif-plugged-eee9f0a5-2fd4-4de0-8119-d7745404f4d2 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:27:45 np0005544118 nova_compute[187283]: 2025-12-03 14:27:45.225 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:49 np0005544118 nova_compute[187283]: 2025-12-03 14:27:49.096 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:27:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:27:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:27:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:27:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:27:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:27:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:27:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:27:50 np0005544118 nova_compute[187283]: 2025-12-03 14:27:50.227 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:51 np0005544118 nova_compute[187283]: 2025-12-03 14:27:51.636 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772056.635197, 97d9ec2a-1d59-4429-8183-dd3d8114871b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:27:51 np0005544118 nova_compute[187283]: 2025-12-03 14:27:51.637 187287 INFO nova.compute.manager [-] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:27:51 np0005544118 nova_compute[187283]: 2025-12-03 14:27:51.780 187287 DEBUG nova.compute.manager [None req-ee192e8f-c9e3-4633-8d24-d0c548c40602 - - - - - -] [instance: 97d9ec2a-1d59-4429-8183-dd3d8114871b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:27:52 np0005544118 podman[212250]: 2025-12-03 14:27:52.836006613 +0000 UTC m=+0.063654338 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, container_name=openstack_network_exporter)
Dec  3 09:27:54 np0005544118 nova_compute[187283]: 2025-12-03 14:27:54.099 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:55 np0005544118 nova_compute[187283]: 2025-12-03 14:27:55.003 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772060.0019221, 9e0d7692-4606-42cf-a3fe-98ccda5e9d06 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:27:55 np0005544118 nova_compute[187283]: 2025-12-03 14:27:55.003 187287 INFO nova.compute.manager [-] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:27:55 np0005544118 nova_compute[187283]: 2025-12-03 14:27:55.271 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:27:55 np0005544118 nova_compute[187283]: 2025-12-03 14:27:55.386 187287 DEBUG nova.compute.manager [None req-cd0d6fcd-1967-4a6f-b2ae-4802b6f2a21e - - - - - -] [instance: 9e0d7692-4606-42cf-a3fe-98ccda5e9d06] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:27:56 np0005544118 podman[212272]: 2025-12-03 14:27:56.813212355 +0000 UTC m=+0.049285265 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:27:59 np0005544118 nova_compute[187283]: 2025-12-03 14:27:59.100 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:00 np0005544118 nova_compute[187283]: 2025-12-03 14:28:00.322 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:28:00.959 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:28:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:28:00.959 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:28:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:28:00.959 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:28:04 np0005544118 nova_compute[187283]: 2025-12-03 14:28:04.102 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:05 np0005544118 nova_compute[187283]: 2025-12-03 14:28:05.373 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:05 np0005544118 podman[197639]: time="2025-12-03T14:28:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:28:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:28:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:28:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:28:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2586 "" "Go-http-client/1.1"
Dec  3 09:28:06 np0005544118 podman[212292]: 2025-12-03 14:28:06.805333568 +0000 UTC m=+0.040910560 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  3 09:28:08 np0005544118 nova_compute[187283]: 2025-12-03 14:28:08.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:09 np0005544118 nova_compute[187283]: 2025-12-03 14:28:09.106 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:09 np0005544118 podman[212310]: 2025-12-03 14:28:09.823086918 +0000 UTC m=+0.054029408 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:28:10 np0005544118 nova_compute[187283]: 2025-12-03 14:28:10.375 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:10 np0005544118 nova_compute[187283]: 2025-12-03 14:28:10.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:12 np0005544118 ovn_controller[95637]: 2025-12-03T14:28:12Z|00106|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Dec  3 09:28:12 np0005544118 nova_compute[187283]: 2025-12-03 14:28:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:12 np0005544118 nova_compute[187283]: 2025-12-03 14:28:12.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:28:12 np0005544118 nova_compute[187283]: 2025-12-03 14:28:12.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:28:12 np0005544118 nova_compute[187283]: 2025-12-03 14:28:12.634 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:28:12 np0005544118 podman[212335]: 2025-12-03 14:28:12.851317611 +0000 UTC m=+0.079740590 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  3 09:28:13 np0005544118 nova_compute[187283]: 2025-12-03 14:28:13.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:14 np0005544118 nova_compute[187283]: 2025-12-03 14:28:14.106 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:14 np0005544118 nova_compute[187283]: 2025-12-03 14:28:14.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:15 np0005544118 nova_compute[187283]: 2025-12-03 14:28:15.378 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.632 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.633 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.633 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.633 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.771 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.772 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5885MB free_disk=73.33637237548828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.772 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.773 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.900 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.901 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.924 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.944 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.965 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:28:16 np0005544118 nova_compute[187283]: 2025-12-03 14:28:16.966 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:28:17 np0005544118 nova_compute[187283]: 2025-12-03 14:28:17.965 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:18 np0005544118 nova_compute[187283]: 2025-12-03 14:28:18.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:28:19 np0005544118 nova_compute[187283]: 2025-12-03 14:28:19.107 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:28:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:28:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:28:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:28:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:28:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:28:20 np0005544118 nova_compute[187283]: 2025-12-03 14:28:20.380 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:20 np0005544118 nova_compute[187283]: 2025-12-03 14:28:20.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:28:20 np0005544118 nova_compute[187283]: 2025-12-03 14:28:20.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  3 09:28:23 np0005544118 podman[212363]: 2025-12-03 14:28:23.831383907 +0000 UTC m=+0.061739274 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a 
stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  3 09:28:24 np0005544118 nova_compute[187283]: 2025-12-03 14:28:24.108 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:24 np0005544118 nova_compute[187283]: 2025-12-03 14:28:24.353 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:25 np0005544118 nova_compute[187283]: 2025-12-03 14:28:25.383 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:27 np0005544118 podman[212382]: 2025-12-03 14:28:27.868609375 +0000 UTC m=+0.097184700 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  3 09:28:29 np0005544118 nova_compute[187283]: 2025-12-03 14:28:29.111 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:30 np0005544118 nova_compute[187283]: 2025-12-03 14:28:30.386 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:34 np0005544118 nova_compute[187283]: 2025-12-03 14:28:34.113 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:35 np0005544118 nova_compute[187283]: 2025-12-03 14:28:35.388 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:35 np0005544118 podman[197639]: time="2025-12-03T14:28:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:28:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:28:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:28:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:28:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2585 "" "Go-http-client/1.1"
Dec  3 09:28:37 np0005544118 podman[212403]: 2025-12-03 14:28:37.837086694 +0000 UTC m=+0.054160832 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  3 09:28:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:28:38.823 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec  3 09:28:38 np0005544118 nova_compute[187283]: 2025-12-03 14:28:38.823 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:28:38.824 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec  3 09:28:39 np0005544118 nova_compute[187283]: 2025-12-03 14:28:39.116 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:40 np0005544118 nova_compute[187283]: 2025-12-03 14:28:40.391 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:40 np0005544118 podman[212422]: 2025-12-03 14:28:40.812553267 +0000 UTC m=+0.045765097 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:28:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:28:42.826 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec  3 09:28:43 np0005544118 podman[212448]: 2025-12-03 14:28:43.842350615 +0000 UTC m=+0.079490963 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:28:44 np0005544118 nova_compute[187283]: 2025-12-03 14:28:44.117 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:45 np0005544118 nova_compute[187283]: 2025-12-03 14:28:45.394 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:49 np0005544118 nova_compute[187283]: 2025-12-03 14:28:49.119 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:28:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:28:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:28:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:28:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:28:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:28:50 np0005544118 nova_compute[187283]: 2025-12-03 14:28:50.396 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:54 np0005544118 nova_compute[187283]: 2025-12-03 14:28:54.121 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:54 np0005544118 podman[212475]: 2025-12-03 14:28:54.860073147 +0000 UTC m=+0.079692608 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, version=9.6, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal)
Dec  3 09:28:55 np0005544118 nova_compute[187283]: 2025-12-03 14:28:55.399 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:28:58 np0005544118 podman[212498]: 2025-12-03 14:28:58.214524141 +0000 UTC m=+0.097383655 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec  3 09:28:59 np0005544118 nova_compute[187283]: 2025-12-03 14:28:59.123 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:29:00 np0005544118 ovn_controller[95637]: 2025-12-03T14:29:00Z|00107|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  3 09:29:00 np0005544118 nova_compute[187283]: 2025-12-03 14:29:00.400 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:29:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:29:00.960 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:29:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:29:00.961 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:29:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:29:00.961 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:29:04 np0005544118 nova_compute[187283]: 2025-12-03 14:29:04.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:29:05 np0005544118 nova_compute[187283]: 2025-12-03 14:29:05.403 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:29:05 np0005544118 podman[197639]: time="2025-12-03T14:29:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:29:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:29:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:29:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:29:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2587 "" "Go-http-client/1.1"
Dec  3 09:29:08 np0005544118 podman[212519]: 2025-12-03 14:29:08.858777657 +0000 UTC m=+0.088319242 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:29:09 np0005544118 nova_compute[187283]: 2025-12-03 14:29:09.127 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:29:10 np0005544118 nova_compute[187283]: 2025-12-03 14:29:10.405 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:29:10 np0005544118 nova_compute[187283]: 2025-12-03 14:29:10.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:29:10 np0005544118 nova_compute[187283]: 2025-12-03 14:29:10.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:29:11 np0005544118 podman[212539]: 2025-12-03 14:29:11.80934717 +0000 UTC m=+0.046388793 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:29:14 np0005544118 nova_compute[187283]: 2025-12-03 14:29:14.129 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:29:14 np0005544118 nova_compute[187283]: 2025-12-03 14:29:14.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:29:14 np0005544118 nova_compute[187283]: 2025-12-03 14:29:14.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  3 09:29:14 np0005544118 nova_compute[187283]: 2025-12-03 14:29:14.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  3 09:29:14 np0005544118 nova_compute[187283]: 2025-12-03 14:29:14.622 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  3 09:29:14 np0005544118 podman[212563]: 2025-12-03 14:29:14.889335038 +0000 UTC m=+0.101240175 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:29:15 np0005544118 nova_compute[187283]: 2025-12-03 14:29:15.407 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:15 np0005544118 nova_compute[187283]: 2025-12-03 14:29:15.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:29:15 np0005544118 nova_compute[187283]: 2025-12-03 14:29:15.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.603 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.671 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.672 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.672 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.672 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.835 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.836 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5890MB free_disk=73.33637237548828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.837 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.837 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.980 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:29:18 np0005544118 nova_compute[187283]: 2025-12-03 14:29:18.981 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.002 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.018 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.019 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.048 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.068 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.112 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.130 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.355 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.356 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:29:19 np0005544118 nova_compute[187283]: 2025-12-03 14:29:19.356 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.519s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:29:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:29:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:29:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:29:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:29:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:29:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:29:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:29:20 np0005544118 nova_compute[187283]: 2025-12-03 14:29:20.409 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:22 np0005544118 nova_compute[187283]: 2025-12-03 14:29:22.358 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:29:22 np0005544118 nova_compute[187283]: 2025-12-03 14:29:22.358 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:29:23 np0005544118 nova_compute[187283]: 2025-12-03 14:29:23.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:29:24 np0005544118 nova_compute[187283]: 2025-12-03 14:29:24.133 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:25 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:29:25 np0005544118 podman[212590]: 2025-12-03 14:29:25.13177502 +0000 UTC m=+0.074903095 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Dec  3 09:29:25 np0005544118 nova_compute[187283]: 2025-12-03 14:29:25.412 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:28 np0005544118 podman[212611]: 2025-12-03 14:29:28.819523654 +0000 UTC m=+0.051943939 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd)
Dec  3 09:29:29 np0005544118 nova_compute[187283]: 2025-12-03 14:29:29.134 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:30 np0005544118 nova_compute[187283]: 2025-12-03 14:29:30.415 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:34 np0005544118 nova_compute[187283]: 2025-12-03 14:29:34.138 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:35 np0005544118 nova_compute[187283]: 2025-12-03 14:29:35.416 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:35 np0005544118 podman[197639]: time="2025-12-03T14:29:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:29:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:29:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:29:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:29:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec  3 09:29:39 np0005544118 nova_compute[187283]: 2025-12-03 14:29:39.138 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:39 np0005544118 podman[212631]: 2025-12-03 14:29:39.85123524 +0000 UTC m=+0.076159160 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:29:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:29:39.976 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:29:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:29:39.976 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:29:39 np0005544118 nova_compute[187283]: 2025-12-03 14:29:39.977 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:40 np0005544118 nova_compute[187283]: 2025-12-03 14:29:40.418 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:42 np0005544118 podman[212651]: 2025-12-03 14:29:42.845453939 +0000 UTC m=+0.073955508 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:29:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:29:43.979 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:29:44 np0005544118 nova_compute[187283]: 2025-12-03 14:29:44.139 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:45 np0005544118 nova_compute[187283]: 2025-12-03 14:29:45.421 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:45 np0005544118 podman[212675]: 2025-12-03 14:29:45.880316699 +0000 UTC m=+0.111153753 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec  3 09:29:49 np0005544118 nova_compute[187283]: 2025-12-03 14:29:49.141 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:29:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:29:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:29:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:29:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:29:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:29:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:29:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:29:50 np0005544118 nova_compute[187283]: 2025-12-03 14:29:50.423 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:54 np0005544118 nova_compute[187283]: 2025-12-03 14:29:54.143 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:55 np0005544118 nova_compute[187283]: 2025-12-03 14:29:55.425 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:55 np0005544118 podman[212703]: 2025-12-03 14:29:55.818470118 +0000 UTC m=+0.052111545 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  3 09:29:59 np0005544118 nova_compute[187283]: 2025-12-03 14:29:59.145 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:29:59 np0005544118 podman[212725]: 2025-12-03 14:29:59.822917465 +0000 UTC m=+0.054130531 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:30:00 np0005544118 nova_compute[187283]: 2025-12-03 14:30:00.427 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:00.962 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:00.963 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:00.963 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.473 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.473 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.492 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.582 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.583 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.590 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.590 187287 INFO nova.compute.claims [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.696 187287 DEBUG nova.compute.provider_tree [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.711 187287 DEBUG nova.scheduler.client.report [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.733 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.734 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.781 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.781 187287 DEBUG nova.network.neutron [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.799 187287 INFO nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.820 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.934 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.935 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.936 187287 INFO nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Creating image(s)#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.937 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "/var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.937 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.938 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:03 np0005544118 nova_compute[187283]: 2025-12-03 14:30:03.955 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.015 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.017 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.018 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.032 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.055 187287 DEBUG nova.policy [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '050e065161ed4e4dbf457584926aae78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90982dffa8cd42a2b3280941bc8b991e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.091 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.093 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.147 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.164 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk 1073741824" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.165 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.166 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.233 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.234 187287 DEBUG nova.virt.disk.api [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Checking if we can resize image /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.235 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.310 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.312 187287 DEBUG nova.virt.disk.api [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Cannot resize image /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.312 187287 DEBUG nova.objects.instance [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'migration_context' on Instance uuid db2e8954-cb1d-4623-abe8-3c580f3c26e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.334 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.335 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Ensure instance console log exists: /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.336 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.336 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.336 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:04 np0005544118 nova_compute[187283]: 2025-12-03 14:30:04.796 187287 DEBUG nova.network.neutron [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Successfully created port: e855d4ad-5445-4664-b85d-8e6cd5a34cfa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.440 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.501 187287 DEBUG nova.network.neutron [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Successfully updated port: e855d4ad-5445-4664-b85d-8e6cd5a34cfa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.518 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.519 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquired lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.519 187287 DEBUG nova.network.neutron [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.620 187287 DEBUG nova.compute.manager [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received event network-changed-e855d4ad-5445-4664-b85d-8e6cd5a34cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.620 187287 DEBUG nova.compute.manager [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Refreshing instance network info cache due to event network-changed-e855d4ad-5445-4664-b85d-8e6cd5a34cfa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.621 187287 DEBUG oslo_concurrency.lockutils [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:30:05 np0005544118 podman[197639]: time="2025-12-03T14:30:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:30:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:30:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:30:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:30:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Dec  3 09:30:05 np0005544118 nova_compute[187283]: 2025-12-03 14:30:05.673 187287 DEBUG nova.network.neutron [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.350 187287 DEBUG nova.network.neutron [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updating instance_info_cache with network_info: [{"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.370 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Releasing lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.371 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Instance network_info: |[{"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.371 187287 DEBUG oslo_concurrency.lockutils [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.371 187287 DEBUG nova.network.neutron [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Refreshing network info cache for port e855d4ad-5445-4664-b85d-8e6cd5a34cfa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.374 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Start _get_guest_xml network_info=[{"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.378 187287 WARNING nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.382 187287 DEBUG nova.virt.libvirt.host [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.383 187287 DEBUG nova.virt.libvirt.host [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.389 187287 DEBUG nova.virt.libvirt.host [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.390 187287 DEBUG nova.virt.libvirt.host [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.391 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.391 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.391 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.392 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.392 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.392 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.392 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.393 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.393 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.393 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.393 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.393 187287 DEBUG nova.virt.hardware [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.398 187287 DEBUG nova.virt.libvirt.vif [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:30:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-447465536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-447465536',id=11,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-upmbb8zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:30:03Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=db2e8954-cb1d-4623-abe8-3c580f3c26e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.398 187287 DEBUG nova.network.os_vif_util [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.398 187287 DEBUG nova.network.os_vif_util [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:5f:bf,bridge_name='br-int',has_traffic_filtering=True,id=e855d4ad-5445-4664-b85d-8e6cd5a34cfa,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape855d4ad-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.400 187287 DEBUG nova.objects.instance [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'pci_devices' on Instance uuid db2e8954-cb1d-4623-abe8-3c580f3c26e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.413 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <uuid>db2e8954-cb1d-4623-abe8-3c580f3c26e5</uuid>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <name>instance-0000000b</name>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteStrategies-server-447465536</nova:name>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:30:06</nova:creationTime>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:user uuid="050e065161ed4e4dbf457584926aae78">tempest-TestExecuteStrategies-270472559-project-member</nova:user>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:project uuid="90982dffa8cd42a2b3280941bc8b991e">tempest-TestExecuteStrategies-270472559</nova:project>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        <nova:port uuid="e855d4ad-5445-4664-b85d-8e6cd5a34cfa">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <entry name="serial">db2e8954-cb1d-4623-abe8-3c580f3c26e5</entry>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <entry name="uuid">db2e8954-cb1d-4623-abe8-3c580f3c26e5</entry>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk.config"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:92:5f:bf"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <target dev="tape855d4ad-54"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/console.log" append="off"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:30:06 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:30:06 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:30:06 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:30:06 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.414 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Preparing to wait for external event network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.414 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.415 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.415 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.416 187287 DEBUG nova.virt.libvirt.vif [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:30:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-447465536',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-447465536',id=11,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-upmbb8zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:30:03Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=db2e8954-cb1d-4623-abe8-3c580f3c26e5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.416 187287 DEBUG nova.network.os_vif_util [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.416 187287 DEBUG nova.network.os_vif_util [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:92:5f:bf,bridge_name='br-int',has_traffic_filtering=True,id=e855d4ad-5445-4664-b85d-8e6cd5a34cfa,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape855d4ad-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.417 187287 DEBUG os_vif [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:5f:bf,bridge_name='br-int',has_traffic_filtering=True,id=e855d4ad-5445-4664-b85d-8e6cd5a34cfa,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape855d4ad-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.417 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.418 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.418 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.421 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.421 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape855d4ad-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.421 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape855d4ad-54, col_values=(('external_ids', {'iface-id': 'e855d4ad-5445-4664-b85d-8e6cd5a34cfa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:92:5f:bf', 'vm-uuid': 'db2e8954-cb1d-4623-abe8-3c580f3c26e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.423 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:06 np0005544118 NetworkManager[55710]: <info>  [1764772206.4257] manager: (tape855d4ad-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.426 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.431 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.432 187287 INFO os_vif [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:92:5f:bf,bridge_name='br-int',has_traffic_filtering=True,id=e855d4ad-5445-4664-b85d-8e6cd5a34cfa,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape855d4ad-54')#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.478 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.478 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.479 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No VIF found with MAC fa:16:3e:92:5f:bf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.479 187287 INFO nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Using config drive#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.834 187287 INFO nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Creating config drive at /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk.config#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.840 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp43pl9fnb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:06 np0005544118 nova_compute[187283]: 2025-12-03 14:30:06.963 187287 DEBUG oslo_concurrency.processutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp43pl9fnb" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:07 np0005544118 kernel: tape855d4ad-54: entered promiscuous mode
Dec  3 09:30:07 np0005544118 NetworkManager[55710]: <info>  [1764772207.0379] manager: (tape855d4ad-54): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.037 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:07Z|00108|binding|INFO|Claiming lport e855d4ad-5445-4664-b85d-8e6cd5a34cfa for this chassis.
Dec  3 09:30:07 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:07Z|00109|binding|INFO|e855d4ad-5445-4664-b85d-8e6cd5a34cfa: Claiming fa:16:3e:92:5f:bf 10.100.0.8
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.044 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.048 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.054 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:5f:bf 10.100.0.8'], port_security=['fa:16:3e:92:5f:bf 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'db2e8954-cb1d-4623-abe8-3c580f3c26e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=e855d4ad-5445-4664-b85d-8e6cd5a34cfa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.056 104491 INFO neutron.agent.ovn.metadata.agent [-] Port e855d4ad-5445-4664-b85d-8e6cd5a34cfa in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.057 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.070 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[60c22213-7fef-4ce2-8cc9-9028eeb7f0d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.071 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267c5b5d-11 in ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:30:07 np0005544118 systemd-udevd[212781]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:30:07 np0005544118 systemd-machined[153602]: New machine qemu-9-instance-0000000b.
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.072 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267c5b5d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.073 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0f224a3e-3a5c-444e-82ee-448da42b34dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.073 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[035b85f7-76b1-4fb2-a032-bdb293680ecb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 NetworkManager[55710]: <info>  [1764772207.0889] device (tape855d4ad-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:30:07 np0005544118 NetworkManager[55710]: <info>  [1764772207.0899] device (tape855d4ad-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.092 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[819ff779-8935-47b6-9cc3-21f0e214b905]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 systemd[1]: Started Virtual Machine qemu-9-instance-0000000b.
Dec  3 09:30:07 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:07Z|00110|binding|INFO|Setting lport e855d4ad-5445-4664-b85d-8e6cd5a34cfa ovn-installed in OVS
Dec  3 09:30:07 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:07Z|00111|binding|INFO|Setting lport e855d4ad-5445-4664-b85d-8e6cd5a34cfa up in Southbound
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.108 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.108 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[62d21a45-365c-4eda-b81d-596de3cf1a66]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.143 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a5535c-0f69-43e1-9874-866f37252440]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.148 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[08ce2cd3-1c5d-413f-8bfa-fde0567398bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 NetworkManager[55710]: <info>  [1764772207.1489] manager: (tap267c5b5d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.182 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[51f531cb-31f1-4f27-87b4-f01ae53d2ffa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.186 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[a0964340-442d-475b-9ef1-e2c11dff6e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 NetworkManager[55710]: <info>  [1764772207.2095] device (tap267c5b5d-10): carrier: link connected
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.214 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[a586a157-7e4e-4f2a-8700-f293aca22e02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.232 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[26e7307b-3e99-408b-82a8-b500653ec557]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436281, 'reachable_time': 42683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212813, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.247 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5eb15096-fbb9-41da-a7bf-5f62fcc80e31]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:71b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436281, 'tstamp': 436281}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212814, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.262 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5d9c265d-65e8-4d24-8bd1-7ed72243c070]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436281, 'reachable_time': 42683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212815, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.291 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[895c66ce-1a3b-4a1a-9466-fd78f4139c6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.344 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d532b50c-2912-4e03-b087-50e2880c1984]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.347 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.347 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.348 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:30:07 np0005544118 NetworkManager[55710]: <info>  [1764772207.3890] manager: (tap267c5b5d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec  3 09:30:07 np0005544118 kernel: tap267c5b5d-10: entered promiscuous mode
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.388 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.391 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.393 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.395 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:07Z|00112|binding|INFO|Releasing lport f97a1eb5-3e39-4381-b113-959844eaa3b1 from this chassis (sb_readonly=0)
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.398 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.398 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b816f10a-4360-4c29-b6ac-59910d4b45b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.401 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:30:07 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:07.402 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'env', 'PROCESS_TAG=haproxy-267c5b5d-1150-48df-8bea-7890da55de3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267c5b5d-1150-48df-8bea-7890da55de3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.407 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.432 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772207.4318879, db2e8954-cb1d-4623-abe8-3c580f3c26e5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.433 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] VM Started (Lifecycle Event)#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.462 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.467 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772207.4326758, db2e8954-cb1d-4623-abe8-3c580f3c26e5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.468 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.488 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.491 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.514 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.785 187287 DEBUG nova.network.neutron [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updated VIF entry in instance network info cache for port e855d4ad-5445-4664-b85d-8e6cd5a34cfa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.786 187287 DEBUG nova.network.neutron [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updating instance_info_cache with network_info: [{"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.807 187287 DEBUG oslo_concurrency.lockutils [req-851c5db7-f6e0-496a-aef0-79269b158d8f req-17537f89-0d22-43d1-81aa-ff6189acabf9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.810 187287 DEBUG nova.compute.manager [req-e8d4d1d8-1d50-4745-bd6f-add868d1afb9 req-4fc47cb2-945c-4b9e-8f48-07a59bda0384 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received event network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.810 187287 DEBUG oslo_concurrency.lockutils [req-e8d4d1d8-1d50-4745-bd6f-add868d1afb9 req-4fc47cb2-945c-4b9e-8f48-07a59bda0384 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.810 187287 DEBUG oslo_concurrency.lockutils [req-e8d4d1d8-1d50-4745-bd6f-add868d1afb9 req-4fc47cb2-945c-4b9e-8f48-07a59bda0384 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.811 187287 DEBUG oslo_concurrency.lockutils [req-e8d4d1d8-1d50-4745-bd6f-add868d1afb9 req-4fc47cb2-945c-4b9e-8f48-07a59bda0384 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.811 187287 DEBUG nova.compute.manager [req-e8d4d1d8-1d50-4745-bd6f-add868d1afb9 req-4fc47cb2-945c-4b9e-8f48-07a59bda0384 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Processing event network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.812 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.815 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772207.8148284, db2e8954-cb1d-4623-abe8-3c580f3c26e5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.815 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.817 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.824 187287 INFO nova.virt.libvirt.driver [-] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Instance spawned successfully.#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.826 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.833 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.837 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.848 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.849 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.849 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.850 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.850 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.850 187287 DEBUG nova.virt.libvirt.driver [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.861 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:30:07 np0005544118 podman[212854]: 2025-12-03 14:30:07.771978912 +0000 UTC m=+0.026921759 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:30:07 np0005544118 podman[212854]: 2025-12-03 14:30:07.877806521 +0000 UTC m=+0.132749338 container create 7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.906 187287 INFO nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Took 3.97 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.907 187287 DEBUG nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:30:07 np0005544118 systemd[1]: Started libpod-conmon-7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4.scope.
Dec  3 09:30:07 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:30:07 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57b188e967b11683f297e2a30a9bc51e741953d0468802d351be7e34c8f5a8ac/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.959 187287 INFO nova.compute.manager [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Took 4.42 seconds to build instance.#033[00m
Dec  3 09:30:07 np0005544118 nova_compute[187283]: 2025-12-03 14:30:07.976 187287 DEBUG oslo_concurrency.lockutils [None req-058d534d-e8d0-49c2-a168-d46e21656ec1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:08 np0005544118 podman[212854]: 2025-12-03 14:30:08.406001186 +0000 UTC m=+0.660944033 container init 7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:30:08 np0005544118 podman[212854]: 2025-12-03 14:30:08.416272439 +0000 UTC m=+0.671215256 container start 7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0)
Dec  3 09:30:08 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[212869]: [NOTICE]   (212873) : New worker (212875) forked
Dec  3 09:30:08 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[212869]: [NOTICE]   (212873) : Loading success.
Dec  3 09:30:09 np0005544118 nova_compute[187283]: 2025-12-03 14:30:09.149 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:09 np0005544118 nova_compute[187283]: 2025-12-03 14:30:09.894 187287 DEBUG nova.compute.manager [req-1867afcd-d8b3-4a51-8979-9d5a8e011e6b req-6a9f5464-f0b4-429a-a684-a6da5b7b1c88 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received event network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:30:09 np0005544118 nova_compute[187283]: 2025-12-03 14:30:09.894 187287 DEBUG oslo_concurrency.lockutils [req-1867afcd-d8b3-4a51-8979-9d5a8e011e6b req-6a9f5464-f0b4-429a-a684-a6da5b7b1c88 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:09 np0005544118 nova_compute[187283]: 2025-12-03 14:30:09.895 187287 DEBUG oslo_concurrency.lockutils [req-1867afcd-d8b3-4a51-8979-9d5a8e011e6b req-6a9f5464-f0b4-429a-a684-a6da5b7b1c88 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:09 np0005544118 nova_compute[187283]: 2025-12-03 14:30:09.895 187287 DEBUG oslo_concurrency.lockutils [req-1867afcd-d8b3-4a51-8979-9d5a8e011e6b req-6a9f5464-f0b4-429a-a684-a6da5b7b1c88 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:09 np0005544118 nova_compute[187283]: 2025-12-03 14:30:09.895 187287 DEBUG nova.compute.manager [req-1867afcd-d8b3-4a51-8979-9d5a8e011e6b req-6a9f5464-f0b4-429a-a684-a6da5b7b1c88 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] No waiting events found dispatching network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:30:09 np0005544118 nova_compute[187283]: 2025-12-03 14:30:09.895 187287 WARNING nova.compute.manager [req-1867afcd-d8b3-4a51-8979-9d5a8e011e6b req-6a9f5464-f0b4-429a-a684-a6da5b7b1c88 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received unexpected event network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa for instance with vm_state active and task_state None.#033[00m
Dec  3 09:30:10 np0005544118 nova_compute[187283]: 2025-12-03 14:30:10.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:10 np0005544118 podman[212884]: 2025-12-03 14:30:10.831360644 +0000 UTC m=+0.061444504 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:30:11 np0005544118 nova_compute[187283]: 2025-12-03 14:30:11.424 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:12 np0005544118 nova_compute[187283]: 2025-12-03 14:30:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:13 np0005544118 podman[212901]: 2025-12-03 14:30:13.82049302 +0000 UTC m=+0.054643480 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:30:14 np0005544118 nova_compute[187283]: 2025-12-03 14:30:14.150 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:15 np0005544118 nova_compute[187283]: 2025-12-03 14:30:15.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:15 np0005544118 nova_compute[187283]: 2025-12-03 14:30:15.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:30:15 np0005544118 nova_compute[187283]: 2025-12-03 14:30:15.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:30:16 np0005544118 nova_compute[187283]: 2025-12-03 14:30:16.033 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:30:16 np0005544118 nova_compute[187283]: 2025-12-03 14:30:16.034 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:30:16 np0005544118 nova_compute[187283]: 2025-12-03 14:30:16.034 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:30:16 np0005544118 nova_compute[187283]: 2025-12-03 14:30:16.034 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid db2e8954-cb1d-4623-abe8-3c580f3c26e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:30:16 np0005544118 nova_compute[187283]: 2025-12-03 14:30:16.427 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:16 np0005544118 podman[212927]: 2025-12-03 14:30:16.860449665 +0000 UTC m=+0.087802605 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Dec  3 09:30:17 np0005544118 nova_compute[187283]: 2025-12-03 14:30:17.329 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updating instance_info_cache with network_info: [{"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:30:17 np0005544118 nova_compute[187283]: 2025-12-03 14:30:17.395 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:30:17 np0005544118 nova_compute[187283]: 2025-12-03 14:30:17.396 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:30:17 np0005544118 nova_compute[187283]: 2025-12-03 14:30:17.397 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:17 np0005544118 nova_compute[187283]: 2025-12-03 14:30:17.397 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:18 np0005544118 nova_compute[187283]: 2025-12-03 14:30:18.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:18 np0005544118 nova_compute[187283]: 2025-12-03 14:30:18.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.153 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:30:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:30:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:30:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:30:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.650 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.651 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.652 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.652 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.724 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.802 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.804 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:30:19 np0005544118 nova_compute[187283]: 2025-12-03 14:30:19.880 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.071 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.072 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5682MB free_disk=73.3116569519043GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.073 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.073 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.147 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance db2e8954-cb1d-4623-abe8-3c580f3c26e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.148 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.148 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.194 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.215 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:30:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:20.232 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.232 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:20.233 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.241 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:30:20 np0005544118 nova_compute[187283]: 2025-12-03 14:30:20.241 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:30:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:20Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:92:5f:bf 10.100.0.8
Dec  3 09:30:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:20Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:92:5f:bf 10.100.0.8
Dec  3 09:30:21 np0005544118 nova_compute[187283]: 2025-12-03 14:30:21.432 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:23 np0005544118 nova_compute[187283]: 2025-12-03 14:30:23.241 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:30:23 np0005544118 nova_compute[187283]: 2025-12-03 14:30:23.242 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:30:24 np0005544118 nova_compute[187283]: 2025-12-03 14:30:24.155 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:30:24.235 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:30:26 np0005544118 nova_compute[187283]: 2025-12-03 14:30:26.434 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:26 np0005544118 podman[212979]: 2025-12-03 14:30:26.837449809 +0000 UTC m=+0.057750008 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter)
Dec  3 09:30:29 np0005544118 nova_compute[187283]: 2025-12-03 14:30:29.157 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:30 np0005544118 podman[213001]: 2025-12-03 14:30:30.851827948 +0000 UTC m=+0.063351938 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec  3 09:30:31 np0005544118 nova_compute[187283]: 2025-12-03 14:30:31.437 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:34 np0005544118 nova_compute[187283]: 2025-12-03 14:30:34.158 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:35 np0005544118 podman[197639]: time="2025-12-03T14:30:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:30:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:30:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:30:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:30:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Dec  3 09:30:36 np0005544118 nova_compute[187283]: 2025-12-03 14:30:36.440 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:39 np0005544118 nova_compute[187283]: 2025-12-03 14:30:39.160 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:41 np0005544118 nova_compute[187283]: 2025-12-03 14:30:41.443 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:41 np0005544118 podman[213026]: 2025-12-03 14:30:41.817299507 +0000 UTC m=+0.050348407 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:30:43 np0005544118 podman[213048]: 2025-12-03 14:30:43.923394448 +0000 UTC m=+0.053035974 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:30:44 np0005544118 nova_compute[187283]: 2025-12-03 14:30:44.163 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:46 np0005544118 ovn_controller[95637]: 2025-12-03T14:30:46Z|00113|memory_trim|INFO|Detected inactivity (last active 30019 ms ago): trimming memory
Dec  3 09:30:46 np0005544118 nova_compute[187283]: 2025-12-03 14:30:46.446 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:47 np0005544118 podman[213074]: 2025-12-03 14:30:47.835627952 +0000 UTC m=+0.074707782 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  3 09:30:49 np0005544118 nova_compute[187283]: 2025-12-03 14:30:49.165 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:30:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:30:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:30:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:30:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:30:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:30:51 np0005544118 nova_compute[187283]: 2025-12-03 14:30:51.448 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:54 np0005544118 nova_compute[187283]: 2025-12-03 14:30:54.167 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:56 np0005544118 nova_compute[187283]: 2025-12-03 14:30:56.451 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:30:57 np0005544118 podman[213108]: 2025-12-03 14:30:57.853331068 +0000 UTC m=+0.087100057 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, vendor=Red Hat, Inc.)
Dec  3 09:30:59 np0005544118 nova_compute[187283]: 2025-12-03 14:30:59.169 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:00.963 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:00.964 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:00.965 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:01 np0005544118 nova_compute[187283]: 2025-12-03 14:31:01.453 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:01 np0005544118 podman[213132]: 2025-12-03 14:31:01.869516317 +0000 UTC m=+0.101989130 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Dec  3 09:31:04 np0005544118 nova_compute[187283]: 2025-12-03 14:31:04.171 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:05 np0005544118 podman[197639]: time="2025-12-03T14:31:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:31:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:31:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:31:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:31:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec  3 09:31:06 np0005544118 nova_compute[187283]: 2025-12-03 14:31:06.455 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:09 np0005544118 nova_compute[187283]: 2025-12-03 14:31:09.173 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:10 np0005544118 nova_compute[187283]: 2025-12-03 14:31:10.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:11 np0005544118 nova_compute[187283]: 2025-12-03 14:31:11.457 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:11 np0005544118 nova_compute[187283]: 2025-12-03 14:31:11.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:11 np0005544118 nova_compute[187283]: 2025-12-03 14:31:11.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 09:31:12 np0005544118 podman[213152]: 2025-12-03 14:31:12.819421111 +0000 UTC m=+0.051627443 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent)
Dec  3 09:31:13 np0005544118 nova_compute[187283]: 2025-12-03 14:31:13.630 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:14 np0005544118 nova_compute[187283]: 2025-12-03 14:31:14.174 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:14 np0005544118 podman[213172]: 2025-12-03 14:31:14.849876034 +0000 UTC m=+0.072215511 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:31:15 np0005544118 nova_compute[187283]: 2025-12-03 14:31:15.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:15 np0005544118 nova_compute[187283]: 2025-12-03 14:31:15.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:31:15 np0005544118 nova_compute[187283]: 2025-12-03 14:31:15.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:31:15 np0005544118 nova_compute[187283]: 2025-12-03 14:31:15.864 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:31:15 np0005544118 nova_compute[187283]: 2025-12-03 14:31:15.864 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:31:15 np0005544118 nova_compute[187283]: 2025-12-03 14:31:15.864 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:31:15 np0005544118 nova_compute[187283]: 2025-12-03 14:31:15.865 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid db2e8954-cb1d-4623-abe8-3c580f3c26e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:31:16 np0005544118 nova_compute[187283]: 2025-12-03 14:31:16.505 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:17 np0005544118 nova_compute[187283]: 2025-12-03 14:31:17.255 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updating instance_info_cache with network_info: [{"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:31:17 np0005544118 nova_compute[187283]: 2025-12-03 14:31:17.274 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-db2e8954-cb1d-4623-abe8-3c580f3c26e5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:31:17 np0005544118 nova_compute[187283]: 2025-12-03 14:31:17.274 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:31:17 np0005544118 nova_compute[187283]: 2025-12-03 14:31:17.274 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:17 np0005544118 nova_compute[187283]: 2025-12-03 14:31:17.275 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:17 np0005544118 nova_compute[187283]: 2025-12-03 14:31:17.619 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:18 np0005544118 nova_compute[187283]: 2025-12-03 14:31:18.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:18 np0005544118 podman[213197]: 2025-12-03 14:31:18.848508134 +0000 UTC m=+0.081110695 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:31:19 np0005544118 nova_compute[187283]: 2025-12-03 14:31:19.046 187287 DEBUG nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Creating tmpfile /var/lib/nova/instances/tmp1dx2mgzj to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:31:19 np0005544118 nova_compute[187283]: 2025-12-03 14:31:19.047 187287 DEBUG nova.compute.manager [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1dx2mgzj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:31:19 np0005544118 nova_compute[187283]: 2025-12-03 14:31:19.175 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:31:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:31:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:31:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:31:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:31:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:31:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.472 187287 DEBUG nova.compute.manager [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1dx2mgzj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4cd2af63-02b0-43f3-82a8-384a03824246',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.507 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-4cd2af63-02b0-43f3-82a8-384a03824246" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.508 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-4cd2af63-02b0-43f3-82a8-384a03824246" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.508 187287 DEBUG nova.network.neutron [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.625 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.769 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.793 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Triggering sync for uuid db2e8954-cb1d-4623-abe8-3c580f3c26e5 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.793 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.794 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:20 np0005544118 nova_compute[187283]: 2025-12-03 14:31:20.815 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.508 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.626 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.627 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.627 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.627 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.678 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.749 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.750 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.829 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.978 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.979 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5691MB free_disk=73.30776977539062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.979 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:21 np0005544118 nova_compute[187283]: 2025-12-03 14:31:21.980 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.035 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Migration for instance 4cd2af63-02b0-43f3-82a8-384a03824246 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.054 187287 INFO nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Updating resource usage from migration f9cf1c8a-a742-4dec-b2cb-3466a4d095da#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.054 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Starting to track incoming migration f9cf1c8a-a742-4dec-b2cb-3466a4d095da with flavor ec610f84-c649-49d7-9c7a-a22befc31fb8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.133 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance db2e8954-cb1d-4623-abe8-3c580f3c26e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.153 187287 WARNING nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 4cd2af63-02b0-43f3-82a8-384a03824246 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.153 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.154 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.216 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.478 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.554 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:31:22 np0005544118 nova_compute[187283]: 2025-12-03 14:31:22.555 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.575s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.034 187287 DEBUG nova.network.neutron [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Updating instance_info_cache with network_info: [{"id": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "address": "fa:16:3e:40:5d:0c", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfd5243c-59", "ovs_interfaceid": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.049 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-4cd2af63-02b0-43f3-82a8-384a03824246" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.051 187287 DEBUG nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1dx2mgzj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4cd2af63-02b0-43f3-82a8-384a03824246',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.051 187287 DEBUG nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Creating instance directory: /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.052 187287 DEBUG nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Creating disk.info with the contents: {'/var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk': 'qcow2', '/var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.052 187287 DEBUG nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.053 187287 DEBUG nova.objects.instance [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4cd2af63-02b0-43f3-82a8-384a03824246 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.079 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.129 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.130 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.130 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.140 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.203 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.204 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.537 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk 1073741824" returned: 0 in 0.333s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.538 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.408s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.539 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.555 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.556 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.591 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.591 187287 DEBUG nova.virt.disk.api [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.592 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.643 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.644 187287 DEBUG nova.virt.disk.api [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.644 187287 DEBUG nova.objects.instance [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 4cd2af63-02b0-43f3-82a8-384a03824246 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.662 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.684 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.685 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk.config to /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:31:23 np0005544118 nova_compute[187283]: 2025-12-03 14:31:23.686 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk.config /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.118 187287 DEBUG oslo_concurrency.processutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246/disk.config /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.119 187287 DEBUG nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.120 187287 DEBUG nova.virt.libvirt.vif [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:30:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1011182355',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1011182355',id=12,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:30:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-qf6k2vtd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:30:19Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=4cd2af63-02b0-43f3-82a8-384a03824246,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "address": "fa:16:3e:40:5d:0c", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbfd5243c-59", "ovs_interfaceid": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.121 187287 DEBUG nova.network.os_vif_util [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "address": "fa:16:3e:40:5d:0c", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbfd5243c-59", "ovs_interfaceid": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.122 187287 DEBUG nova.network.os_vif_util [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:5d:0c,bridge_name='br-int',has_traffic_filtering=True,id=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfd5243c-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.123 187287 DEBUG os_vif [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:5d:0c,bridge_name='br-int',has_traffic_filtering=True,id=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfd5243c-59') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.124 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.125 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.128 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.129 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbfd5243c-59, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.129 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbfd5243c-59, col_values=(('external_ids', {'iface-id': 'bfd5243c-59f8-42f6-bb9a-edb7d4d6e106', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:5d:0c', 'vm-uuid': '4cd2af63-02b0-43f3-82a8-384a03824246'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.131 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:24 np0005544118 NetworkManager[55710]: <info>  [1764772284.1324] manager: (tapbfd5243c-59): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.134 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.137 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.138 187287 INFO os_vif [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:5d:0c,bridge_name='br-int',has_traffic_filtering=True,id=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfd5243c-59')#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.138 187287 DEBUG nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.139 187287 DEBUG nova.compute.manager [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1dx2mgzj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4cd2af63-02b0-43f3-82a8-384a03824246',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:31:24 np0005544118 nova_compute[187283]: 2025-12-03 14:31:24.177 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:26.515 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:31:26 np0005544118 nova_compute[187283]: 2025-12-03 14:31:26.516 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:26.517 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:31:26 np0005544118 nova_compute[187283]: 2025-12-03 14:31:26.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:31:26 np0005544118 nova_compute[187283]: 2025-12-03 14:31:26.924 187287 DEBUG nova.network.neutron [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Port bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:31:26 np0005544118 nova_compute[187283]: 2025-12-03 14:31:26.926 187287 DEBUG nova.compute.manager [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1dx2mgzj',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4cd2af63-02b0-43f3-82a8-384a03824246',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:31:27 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:31:27 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:31:27 np0005544118 NetworkManager[55710]: <info>  [1764772287.4119] manager: (tapbfd5243c-59): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec  3 09:31:27 np0005544118 kernel: tapbfd5243c-59: entered promiscuous mode
Dec  3 09:31:27 np0005544118 nova_compute[187283]: 2025-12-03 14:31:27.420 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:27Z|00114|binding|INFO|Claiming lport bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 for this additional chassis.
Dec  3 09:31:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:27Z|00115|binding|INFO|bfd5243c-59f8-42f6-bb9a-edb7d4d6e106: Claiming fa:16:3e:40:5d:0c 10.100.0.5
Dec  3 09:31:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:27Z|00116|binding|INFO|Setting lport bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 ovn-installed in OVS
Dec  3 09:31:27 np0005544118 nova_compute[187283]: 2025-12-03 14:31:27.438 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:27 np0005544118 systemd-udevd[213298]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:31:27 np0005544118 systemd-machined[153602]: New machine qemu-10-instance-0000000c.
Dec  3 09:31:27 np0005544118 NetworkManager[55710]: <info>  [1764772287.4554] device (tapbfd5243c-59): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:31:27 np0005544118 NetworkManager[55710]: <info>  [1764772287.4575] device (tapbfd5243c-59): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:31:27 np0005544118 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Dec  3 09:31:28 np0005544118 nova_compute[187283]: 2025-12-03 14:31:28.073 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772288.0734243, 4cd2af63-02b0-43f3-82a8-384a03824246 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:31:28 np0005544118 nova_compute[187283]: 2025-12-03 14:31:28.074 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] VM Started (Lifecycle Event)#033[00m
Dec  3 09:31:28 np0005544118 nova_compute[187283]: 2025-12-03 14:31:28.094 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:31:28 np0005544118 podman[213329]: 2025-12-03 14:31:28.879605981 +0000 UTC m=+0.096082432 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc.)
Dec  3 09:31:29 np0005544118 nova_compute[187283]: 2025-12-03 14:31:29.133 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:29 np0005544118 nova_compute[187283]: 2025-12-03 14:31:29.179 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:30 np0005544118 nova_compute[187283]: 2025-12-03 14:31:30.061 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772290.060764, 4cd2af63-02b0-43f3-82a8-384a03824246 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:31:30 np0005544118 nova_compute[187283]: 2025-12-03 14:31:30.061 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:31:30 np0005544118 nova_compute[187283]: 2025-12-03 14:31:30.080 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:31:30 np0005544118 nova_compute[187283]: 2025-12-03 14:31:30.083 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:31:30 np0005544118 nova_compute[187283]: 2025-12-03 14:31:30.105 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:31:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:32Z|00117|binding|INFO|Claiming lport bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 for this chassis.
Dec  3 09:31:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:32Z|00118|binding|INFO|bfd5243c-59f8-42f6-bb9a-edb7d4d6e106: Claiming fa:16:3e:40:5d:0c 10.100.0.5
Dec  3 09:31:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:32Z|00119|binding|INFO|Setting lport bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 up in Southbound
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.341 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:5d:0c 10.100.0.5'], port_security=['fa:16:3e:40:5d:0c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4cd2af63-02b0-43f3-82a8-384a03824246', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.344 104491 INFO neutron.agent.ovn.metadata.agent [-] Port bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.347 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.374 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1714aecb-e458-436d-8f83-b54a914cee4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.472 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[6a067e36-5066-4380-8a17-7de42852da45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:32 np0005544118 nova_compute[187283]: 2025-12-03 14:31:32.477 187287 INFO nova.compute.manager [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Post operation of migration started#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.477 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[b71e53db-882b-4d20-9c32-e763b5e8f5e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.519 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.525 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[e72a6a0b-e5b8-47f0-a99c-ffed52b940ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.551 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ea24f776-4a4b-4495-b384-bdb7a1d4bc1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436281, 'reachable_time': 42683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213357, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.573 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[81104da0-20db-4c7b-9e54-a1effbd93f89]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436292, 'tstamp': 436292}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213358, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436294, 'tstamp': 436294}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213358, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.575 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:32 np0005544118 nova_compute[187283]: 2025-12-03 14:31:32.577 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.578 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.578 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.579 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:32.579 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:31:32 np0005544118 nova_compute[187283]: 2025-12-03 14:31:32.702 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-4cd2af63-02b0-43f3-82a8-384a03824246" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:31:32 np0005544118 nova_compute[187283]: 2025-12-03 14:31:32.703 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-4cd2af63-02b0-43f3-82a8-384a03824246" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:31:32 np0005544118 nova_compute[187283]: 2025-12-03 14:31:32.703 187287 DEBUG nova.network.neutron [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:31:32 np0005544118 podman[213359]: 2025-12-03 14:31:32.849237013 +0000 UTC m=+0.071268174 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.138 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.179 187287 DEBUG nova.network.neutron [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Updating instance_info_cache with network_info: [{"id": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "address": "fa:16:3e:40:5d:0c", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfd5243c-59", "ovs_interfaceid": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.181 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.236 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-4cd2af63-02b0-43f3-82a8-384a03824246" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.256 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.257 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.257 187287 DEBUG oslo_concurrency.lockutils [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:34 np0005544118 nova_compute[187283]: 2025-12-03 14:31:34.262 187287 INFO nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:31:34 np0005544118 virtqemud[186958]: Domain id=10 name='instance-0000000c' uuid=4cd2af63-02b0-43f3-82a8-384a03824246 is tainted: custom-monitor
Dec  3 09:31:35 np0005544118 nova_compute[187283]: 2025-12-03 14:31:35.270 187287 INFO nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:31:35 np0005544118 podman[197639]: time="2025-12-03T14:31:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:31:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:31:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:31:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:31:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Dec  3 09:31:36 np0005544118 nova_compute[187283]: 2025-12-03 14:31:36.278 187287 INFO nova.virt.libvirt.driver [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:31:36 np0005544118 nova_compute[187283]: 2025-12-03 14:31:36.283 187287 DEBUG nova.compute.manager [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:31:36 np0005544118 nova_compute[187283]: 2025-12-03 14:31:36.306 187287 DEBUG nova.objects.instance [None req-8987a5a6-5b12-4aa3-9155-0ef8e44e1944 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:31:39 np0005544118 nova_compute[187283]: 2025-12-03 14:31:39.141 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:39 np0005544118 nova_compute[187283]: 2025-12-03 14:31:39.183 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.850 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "4cd2af63-02b0-43f3-82a8-384a03824246" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.851 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.851 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.851 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.851 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.852 187287 INFO nova.compute.manager [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Terminating instance#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.853 187287 DEBUG nova.compute.manager [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:31:40 np0005544118 kernel: tapbfd5243c-59 (unregistering): left promiscuous mode
Dec  3 09:31:40 np0005544118 NetworkManager[55710]: <info>  [1764772300.8798] device (tapbfd5243c-59): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:31:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:40Z|00120|binding|INFO|Releasing lport bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 from this chassis (sb_readonly=0)
Dec  3 09:31:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:40Z|00121|binding|INFO|Setting lport bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 down in Southbound
Dec  3 09:31:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:40Z|00122|binding|INFO|Removing iface tapbfd5243c-59 ovn-installed in OVS
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.889 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.892 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.898 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:5d:0c 10.100.0.5'], port_security=['fa:16:3e:40:5d:0c 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4cd2af63-02b0-43f3-82a8-384a03824246', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.899 104491 INFO neutron.agent.ovn.metadata.agent [-] Port bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.900 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:31:40 np0005544118 nova_compute[187283]: 2025-12-03 14:31:40.902 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.918 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d10e56a2-6430-4100-a486-932eef3537bb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:40 np0005544118 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec  3 09:31:40 np0005544118 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 1.655s CPU time.
Dec  3 09:31:40 np0005544118 systemd-machined[153602]: Machine qemu-10-instance-0000000c terminated.
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.945 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[9b52b03c-d71d-4fba-bf6f-88ca32ba1387]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.948 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[0bf7ea83-da83-48c6-a96f-6a14ffab89b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.973 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[68ff8a1e-8cc0-474b-bd2c-0c8c0f89ed26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:40.986 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9abb9fb2-db19-491e-8e26-5e62542891b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436281, 'reachable_time': 42683, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213391, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:41.001 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3be964-c2e6-4a54-8aeb-9745796bf360]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436292, 'tstamp': 436292}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213392, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 436294, 'tstamp': 436294}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213392, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:41.002 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.050 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.053 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:41.054 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:41.055 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:31:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:41.055 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:41.055 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.112 187287 INFO nova.virt.libvirt.driver [-] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Instance destroyed successfully.#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.113 187287 DEBUG nova.objects.instance [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid 4cd2af63-02b0-43f3-82a8-384a03824246 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.127 187287 DEBUG nova.virt.libvirt.vif [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:30:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1011182355',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1011182355',id=12,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:30:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-qf6k2vtd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram=
'0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:31:36Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=4cd2af63-02b0-43f3-82a8-384a03824246,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "address": "fa:16:3e:40:5d:0c", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfd5243c-59", "ovs_interfaceid": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.128 187287 DEBUG nova.network.os_vif_util [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "address": "fa:16:3e:40:5d:0c", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbfd5243c-59", "ovs_interfaceid": "bfd5243c-59f8-42f6-bb9a-edb7d4d6e106", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.129 187287 DEBUG nova.network.os_vif_util [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:5d:0c,bridge_name='br-int',has_traffic_filtering=True,id=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfd5243c-59') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.129 187287 DEBUG os_vif [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:5d:0c,bridge_name='br-int',has_traffic_filtering=True,id=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfd5243c-59') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.132 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.132 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbfd5243c-59, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.135 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.138 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.141 187287 INFO os_vif [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:5d:0c,bridge_name='br-int',has_traffic_filtering=True,id=bfd5243c-59f8-42f6-bb9a-edb7d4d6e106,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbfd5243c-59')#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.142 187287 INFO nova.virt.libvirt.driver [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Deleting instance files /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246_del#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.142 187287 INFO nova.virt.libvirt.driver [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Deletion of /var/lib/nova/instances/4cd2af63-02b0-43f3-82a8-384a03824246_del complete#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.191 187287 INFO nova.compute.manager [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.192 187287 DEBUG oslo.service.loopingcall [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.192 187287 DEBUG nova.compute.manager [-] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.192 187287 DEBUG nova.network.neutron [-] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.201 187287 DEBUG nova.compute.manager [req-0551aab0-3cfe-43f2-8336-2523b8518cc7 req-1944877f-b0b4-402e-8450-2962e13902a1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Received event network-vif-unplugged-bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.201 187287 DEBUG oslo_concurrency.lockutils [req-0551aab0-3cfe-43f2-8336-2523b8518cc7 req-1944877f-b0b4-402e-8450-2962e13902a1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.201 187287 DEBUG oslo_concurrency.lockutils [req-0551aab0-3cfe-43f2-8336-2523b8518cc7 req-1944877f-b0b4-402e-8450-2962e13902a1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.201 187287 DEBUG oslo_concurrency.lockutils [req-0551aab0-3cfe-43f2-8336-2523b8518cc7 req-1944877f-b0b4-402e-8450-2962e13902a1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.202 187287 DEBUG nova.compute.manager [req-0551aab0-3cfe-43f2-8336-2523b8518cc7 req-1944877f-b0b4-402e-8450-2962e13902a1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] No waiting events found dispatching network-vif-unplugged-bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.202 187287 DEBUG nova.compute.manager [req-0551aab0-3cfe-43f2-8336-2523b8518cc7 req-1944877f-b0b4-402e-8450-2962e13902a1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Received event network-vif-unplugged-bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.790 187287 DEBUG nova.network.neutron [-] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.809 187287 INFO nova.compute.manager [-] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Took 0.62 seconds to deallocate network for instance.#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.856 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.857 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.863 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.871 187287 DEBUG nova.compute.manager [req-330dd808-05ee-4969-9ae1-52b83cf9cf26 req-ddc26a08-9568-4ecd-b8c9-c987966474f8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Received event network-vif-deleted-bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:31:41 np0005544118 nova_compute[187283]: 2025-12-03 14:31:41.901 187287 INFO nova.scheduler.client.report [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance 4cd2af63-02b0-43f3-82a8-384a03824246#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.686 187287 DEBUG oslo_concurrency.lockutils [None req-a85bf788-74a4-40f4-822f-c77db5aafbe6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.835s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.943 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.944 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.944 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.944 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.945 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.946 187287 INFO nova.compute.manager [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Terminating instance#033[00m
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.947 187287 DEBUG nova.compute.manager [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:31:42 np0005544118 kernel: tape855d4ad-54 (unregistering): left promiscuous mode
Dec  3 09:31:42 np0005544118 NetworkManager[55710]: <info>  [1764772302.9788] device (tape855d4ad-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:31:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:42Z|00123|binding|INFO|Releasing lport e855d4ad-5445-4664-b85d-8e6cd5a34cfa from this chassis (sb_readonly=0)
Dec  3 09:31:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:42Z|00124|binding|INFO|Setting lport e855d4ad-5445-4664-b85d-8e6cd5a34cfa down in Southbound
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.984 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:31:42Z|00125|binding|INFO|Removing iface tape855d4ad-54 ovn-installed in OVS
Dec  3 09:31:42 np0005544118 nova_compute[187283]: 2025-12-03 14:31:42.986 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.000 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Dec  3 09:31:43 np0005544118 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000b.scope: Consumed 16.275s CPU time.
Dec  3 09:31:43 np0005544118 systemd-machined[153602]: Machine qemu-9-instance-0000000b terminated.
Dec  3 09:31:43 np0005544118 podman[213414]: 2025-12-03 14:31:43.068024574 +0000 UTC m=+0.052255782 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.103 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:92:5f:bf 10.100.0.8'], port_security=['fa:16:3e:92:5f:bf 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'db2e8954-cb1d-4623-abe8-3c580f3c26e5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=e855d4ad-5445-4664-b85d-8e6cd5a34cfa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.105 104491 INFO neutron.agent.ovn.metadata.agent [-] Port e855d4ad-5445-4664-b85d-8e6cd5a34cfa in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.106 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267c5b5d-1150-48df-8bea-7890da55de3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.107 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc924af-f339-4f99-a8e2-fddb7745425d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.107 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace which is not needed anymore#033[00m
Dec  3 09:31:43 np0005544118 kernel: tape855d4ad-54: entered promiscuous mode
Dec  3 09:31:43 np0005544118 kernel: tape855d4ad-54 (unregistering): left promiscuous mode
Dec  3 09:31:43 np0005544118 NetworkManager[55710]: <info>  [1764772303.1711] manager: (tape855d4ad-54): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.223 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[212869]: [NOTICE]   (212873) : haproxy version is 2.8.14-c23fe91
Dec  3 09:31:43 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[212869]: [NOTICE]   (212873) : path to executable is /usr/sbin/haproxy
Dec  3 09:31:43 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[212869]: [WARNING]  (212873) : Exiting Master process...
Dec  3 09:31:43 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[212869]: [ALERT]    (212873) : Current worker (212875) exited with code 143 (Terminated)
Dec  3 09:31:43 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[212869]: [WARNING]  (212873) : All workers exited. Exiting... (0)
Dec  3 09:31:43 np0005544118 systemd[1]: libpod-7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4.scope: Deactivated successfully.
Dec  3 09:31:43 np0005544118 podman[213461]: 2025-12-03 14:31:43.24175702 +0000 UTC m=+0.046724015 container died 7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.252 187287 INFO nova.virt.libvirt.driver [-] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Instance destroyed successfully.#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.252 187287 DEBUG nova.objects.instance [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid db2e8954-cb1d-4623-abe8-3c580f3c26e5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:31:43 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4-userdata-shm.mount: Deactivated successfully.
Dec  3 09:31:43 np0005544118 systemd[1]: var-lib-containers-storage-overlay-57b188e967b11683f297e2a30a9bc51e741953d0468802d351be7e34c8f5a8ac-merged.mount: Deactivated successfully.
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.286 187287 DEBUG nova.virt.libvirt.vif [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:30:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-447465536',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-447465536',id=11,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:30:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-upmbb8zv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_na
me='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:30:07Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=db2e8954-cb1d-4623-abe8-3c580f3c26e5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.288 187287 DEBUG nova.network.os_vif_util [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "address": "fa:16:3e:92:5f:bf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape855d4ad-54", "ovs_interfaceid": "e855d4ad-5445-4664-b85d-8e6cd5a34cfa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.289 187287 DEBUG nova.network.os_vif_util [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:92:5f:bf,bridge_name='br-int',has_traffic_filtering=True,id=e855d4ad-5445-4664-b85d-8e6cd5a34cfa,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape855d4ad-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.290 187287 DEBUG os_vif [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:5f:bf,bridge_name='br-int',has_traffic_filtering=True,id=e855d4ad-5445-4664-b85d-8e6cd5a34cfa,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape855d4ad-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:31:43 np0005544118 podman[213461]: 2025-12-03 14:31:43.291270132 +0000 UTC m=+0.096237127 container cleanup 7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.291 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.292 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape855d4ad-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.293 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.296 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.299 187287 INFO os_vif [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:92:5f:bf,bridge_name='br-int',has_traffic_filtering=True,id=e855d4ad-5445-4664-b85d-8e6cd5a34cfa,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape855d4ad-54')#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.299 187287 INFO nova.virt.libvirt.driver [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Deleting instance files /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5_del#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.300 187287 INFO nova.virt.libvirt.driver [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Deletion of /var/lib/nova/instances/db2e8954-cb1d-4623-abe8-3c580f3c26e5_del complete#033[00m
Dec  3 09:31:43 np0005544118 systemd[1]: libpod-conmon-7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4.scope: Deactivated successfully.
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.329 187287 DEBUG nova.compute.manager [req-15766ea0-4f3a-48b1-b035-d654c45023f4 req-7fc4600a-b8b0-46ef-a47d-6460e4c995b8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Received event network-vif-plugged-bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.330 187287 DEBUG oslo_concurrency.lockutils [req-15766ea0-4f3a-48b1-b035-d654c45023f4 req-7fc4600a-b8b0-46ef-a47d-6460e4c995b8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.331 187287 DEBUG oslo_concurrency.lockutils [req-15766ea0-4f3a-48b1-b035-d654c45023f4 req-7fc4600a-b8b0-46ef-a47d-6460e4c995b8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.331 187287 DEBUG oslo_concurrency.lockutils [req-15766ea0-4f3a-48b1-b035-d654c45023f4 req-7fc4600a-b8b0-46ef-a47d-6460e4c995b8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4cd2af63-02b0-43f3-82a8-384a03824246-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.331 187287 DEBUG nova.compute.manager [req-15766ea0-4f3a-48b1-b035-d654c45023f4 req-7fc4600a-b8b0-46ef-a47d-6460e4c995b8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] No waiting events found dispatching network-vif-plugged-bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.331 187287 WARNING nova.compute.manager [req-15766ea0-4f3a-48b1-b035-d654c45023f4 req-7fc4600a-b8b0-46ef-a47d-6460e4c995b8 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Received unexpected event network-vif-plugged-bfd5243c-59f8-42f6-bb9a-edb7d4d6e106 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:31:43 np0005544118 podman[213497]: 2025-12-03 14:31:43.362785291 +0000 UTC m=+0.049706319 container remove 7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.369 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f8185df3-2e59-430c-9a03-f94b5771120c]: (4, ('Wed Dec  3 02:31:43 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4)\n7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4\nWed Dec  3 02:31:43 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4)\n7f8d59b85d8fea8ec4fa80ad4b3646ea5a62cde9d96321a5dfc979f18ed874b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.371 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aee1bee9-199d-4304-bd7d-7efe03abcf4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.372 187287 INFO nova.compute.manager [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.372 187287 DEBUG oslo.service.loopingcall [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.373 187287 DEBUG nova.compute.manager [-] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.373 187287 DEBUG nova.network.neutron [-] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.373 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.375 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 kernel: tap267c5b5d-10: left promiscuous mode
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.387 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.389 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.390 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a8916cb9-b43b-4489-92c0-4174ec8b9e60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.402 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3b86899b-f1aa-43cd-9bb0-655e7cd301fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.403 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3d530bdd-f1df-4007-8275-26b1471d07ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.421 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d0d86c2a-05b1-42e6-8e74-bffd9f6df31e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 436274, 'reachable_time': 36642, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213512, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 systemd[1]: run-netns-ovnmeta\x2d267c5b5d\x2d1150\x2d48df\x2d8bea\x2d7890da55de3f.mount: Deactivated successfully.
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.424 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:31:43 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:31:43.425 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f99800-19cf-488c-81a3-8d46fd80e91e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.947 187287 DEBUG nova.compute.manager [req-4778e902-44de-4722-91ff-c4e324c53f27 req-8873d2bf-366d-4df0-8ebb-e60c6facf891 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received event network-vif-unplugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.947 187287 DEBUG oslo_concurrency.lockutils [req-4778e902-44de-4722-91ff-c4e324c53f27 req-8873d2bf-366d-4df0-8ebb-e60c6facf891 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.948 187287 DEBUG oslo_concurrency.lockutils [req-4778e902-44de-4722-91ff-c4e324c53f27 req-8873d2bf-366d-4df0-8ebb-e60c6facf891 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.948 187287 DEBUG oslo_concurrency.lockutils [req-4778e902-44de-4722-91ff-c4e324c53f27 req-8873d2bf-366d-4df0-8ebb-e60c6facf891 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.948 187287 DEBUG nova.compute.manager [req-4778e902-44de-4722-91ff-c4e324c53f27 req-8873d2bf-366d-4df0-8ebb-e60c6facf891 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] No waiting events found dispatching network-vif-unplugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:31:43 np0005544118 nova_compute[187283]: 2025-12-03 14:31:43.949 187287 DEBUG nova.compute.manager [req-4778e902-44de-4722-91ff-c4e324c53f27 req-8873d2bf-366d-4df0-8ebb-e60c6facf891 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received event network-vif-unplugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.185 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.434 187287 DEBUG nova.network.neutron [-] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.466 187287 INFO nova.compute.manager [-] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Took 1.09 seconds to deallocate network for instance.#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.512 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.513 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.608 187287 DEBUG nova.compute.provider_tree [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.622 187287 DEBUG nova.scheduler.client.report [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.643 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.666 187287 INFO nova.scheduler.client.report [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance db2e8954-cb1d-4623-abe8-3c580f3c26e5#033[00m
Dec  3 09:31:44 np0005544118 nova_compute[187283]: 2025-12-03 14:31:44.888 187287 DEBUG oslo_concurrency.lockutils [None req-2cfdb896-2b9d-417e-b559-17c12b3628d6 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.944s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:45 np0005544118 podman[213514]: 2025-12-03 14:31:45.851487664 +0000 UTC m=+0.078028796 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:31:46 np0005544118 nova_compute[187283]: 2025-12-03 14:31:46.033 187287 DEBUG nova.compute.manager [req-f1930daf-e2d7-45e3-bd6d-7008ab62c037 req-71aaae51-b40d-4161-98d2-91946201e76b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received event network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:31:46 np0005544118 nova_compute[187283]: 2025-12-03 14:31:46.034 187287 DEBUG oslo_concurrency.lockutils [req-f1930daf-e2d7-45e3-bd6d-7008ab62c037 req-71aaae51-b40d-4161-98d2-91946201e76b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:31:46 np0005544118 nova_compute[187283]: 2025-12-03 14:31:46.034 187287 DEBUG oslo_concurrency.lockutils [req-f1930daf-e2d7-45e3-bd6d-7008ab62c037 req-71aaae51-b40d-4161-98d2-91946201e76b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:31:46 np0005544118 nova_compute[187283]: 2025-12-03 14:31:46.034 187287 DEBUG oslo_concurrency.lockutils [req-f1930daf-e2d7-45e3-bd6d-7008ab62c037 req-71aaae51-b40d-4161-98d2-91946201e76b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "db2e8954-cb1d-4623-abe8-3c580f3c26e5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:31:46 np0005544118 nova_compute[187283]: 2025-12-03 14:31:46.034 187287 DEBUG nova.compute.manager [req-f1930daf-e2d7-45e3-bd6d-7008ab62c037 req-71aaae51-b40d-4161-98d2-91946201e76b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] No waiting events found dispatching network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:31:46 np0005544118 nova_compute[187283]: 2025-12-03 14:31:46.034 187287 WARNING nova.compute.manager [req-f1930daf-e2d7-45e3-bd6d-7008ab62c037 req-71aaae51-b40d-4161-98d2-91946201e76b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received unexpected event network-vif-plugged-e855d4ad-5445-4664-b85d-8e6cd5a34cfa for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:31:46 np0005544118 nova_compute[187283]: 2025-12-03 14:31:46.034 187287 DEBUG nova.compute.manager [req-f1930daf-e2d7-45e3-bd6d-7008ab62c037 req-71aaae51-b40d-4161-98d2-91946201e76b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Received event network-vif-deleted-e855d4ad-5445-4664-b85d-8e6cd5a34cfa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:31:48 np0005544118 nova_compute[187283]: 2025-12-03 14:31:48.293 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:49 np0005544118 nova_compute[187283]: 2025-12-03 14:31:49.187 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:31:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:31:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:31:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:31:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:31:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:31:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:31:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:31:49 np0005544118 podman[213538]: 2025-12-03 14:31:49.846631214 +0000 UTC m=+0.081983620 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  3 09:31:53 np0005544118 nova_compute[187283]: 2025-12-03 14:31:53.296 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:54 np0005544118 nova_compute[187283]: 2025-12-03 14:31:54.189 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:56 np0005544118 nova_compute[187283]: 2025-12-03 14:31:56.110 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772301.1090112, 4cd2af63-02b0-43f3-82a8-384a03824246 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:31:56 np0005544118 nova_compute[187283]: 2025-12-03 14:31:56.110 187287 INFO nova.compute.manager [-] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:31:56 np0005544118 nova_compute[187283]: 2025-12-03 14:31:56.131 187287 DEBUG nova.compute.manager [None req-1891c809-c108-4ece-82a8-6621a9fc665d - - - - - -] [instance: 4cd2af63-02b0-43f3-82a8-384a03824246] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:31:58 np0005544118 nova_compute[187283]: 2025-12-03 14:31:58.249 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772303.2477996, db2e8954-cb1d-4623-abe8-3c580f3c26e5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:31:58 np0005544118 nova_compute[187283]: 2025-12-03 14:31:58.250 187287 INFO nova.compute.manager [-] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:31:58 np0005544118 nova_compute[187283]: 2025-12-03 14:31:58.277 187287 DEBUG nova.compute.manager [None req-2361dc95-4ab3-41ea-937c-50d28c4a62c2 - - - - - -] [instance: db2e8954-cb1d-4623-abe8-3c580f3c26e5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:31:58 np0005544118 nova_compute[187283]: 2025-12-03 14:31:58.298 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:59 np0005544118 nova_compute[187283]: 2025-12-03 14:31:59.191 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:31:59 np0005544118 podman[213564]: 2025-12-03 14:31:59.820407715 +0000 UTC m=+0.053370963 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, name=ubi9-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, config_id=edpm, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  3 09:32:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:00.964 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:00.965 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:00.965 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:03 np0005544118 nova_compute[187283]: 2025-12-03 14:32:03.299 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:03 np0005544118 podman[213584]: 2025-12-03 14:32:03.829018959 +0000 UTC m=+0.064225432 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  3 09:32:04 np0005544118 nova_compute[187283]: 2025-12-03 14:32:04.192 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:05 np0005544118 podman[197639]: time="2025-12-03T14:32:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:32:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:32:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:32:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:32:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec  3 09:32:08 np0005544118 nova_compute[187283]: 2025-12-03 14:32:08.301 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:09 np0005544118 nova_compute[187283]: 2025-12-03 14:32:09.235 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:12 np0005544118 nova_compute[187283]: 2025-12-03 14:32:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:13 np0005544118 nova_compute[187283]: 2025-12-03 14:32:13.303 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:13 np0005544118 nova_compute[187283]: 2025-12-03 14:32:13.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:13 np0005544118 podman[213607]: 2025-12-03 14:32:13.82878149 +0000 UTC m=+0.056540008 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:32:14 np0005544118 nova_compute[187283]: 2025-12-03 14:32:14.240 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:14 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:14Z|00126|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Dec  3 09:32:16 np0005544118 nova_compute[187283]: 2025-12-03 14:32:16.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:16 np0005544118 nova_compute[187283]: 2025-12-03 14:32:16.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:32:16 np0005544118 nova_compute[187283]: 2025-12-03 14:32:16.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:32:16 np0005544118 podman[213627]: 2025-12-03 14:32:16.825442823 +0000 UTC m=+0.056860977 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:32:16 np0005544118 nova_compute[187283]: 2025-12-03 14:32:16.831 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:32:17 np0005544118 nova_compute[187283]: 2025-12-03 14:32:17.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:18 np0005544118 nova_compute[187283]: 2025-12-03 14:32:18.304 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:19 np0005544118 nova_compute[187283]: 2025-12-03 14:32:19.275 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:32:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:32:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:32:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:32:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:32:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:32:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:32:19 np0005544118 nova_compute[187283]: 2025-12-03 14:32:19.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:20 np0005544118 nova_compute[187283]: 2025-12-03 14:32:20.601 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:20 np0005544118 nova_compute[187283]: 2025-12-03 14:32:20.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:20 np0005544118 podman[213652]: 2025-12-03 14:32:20.837416844 +0000 UTC m=+0.072535931 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec  3 09:32:21 np0005544118 nova_compute[187283]: 2025-12-03 14:32:21.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.065 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.065 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.065 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.066 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.228 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.229 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5878MB free_disk=73.33626174926758GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.229 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.229 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.620 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.621 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.641 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:32:22 np0005544118 nova_compute[187283]: 2025-12-03 14:32:22.739 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:32:23 np0005544118 nova_compute[187283]: 2025-12-03 14:32:23.067 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:32:23 np0005544118 nova_compute[187283]: 2025-12-03 14:32:23.067 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:23 np0005544118 nova_compute[187283]: 2025-12-03 14:32:23.306 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:24 np0005544118 nova_compute[187283]: 2025-12-03 14:32:24.314 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:25 np0005544118 nova_compute[187283]: 2025-12-03 14:32:25.066 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:32:25 np0005544118 nova_compute[187283]: 2025-12-03 14:32:25.067 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:32:28 np0005544118 nova_compute[187283]: 2025-12-03 14:32:28.308 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:29 np0005544118 nova_compute[187283]: 2025-12-03 14:32:29.317 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:30 np0005544118 nova_compute[187283]: 2025-12-03 14:32:30.455 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:30.455 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:32:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:30.456 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:32:30 np0005544118 podman[213682]: 2025-12-03 14:32:30.830141317 +0000 UTC m=+0.055254503 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1755695350, managed_by=edpm_ansible, version=9.6)
Dec  3 09:32:33 np0005544118 nova_compute[187283]: 2025-12-03 14:32:33.310 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:34 np0005544118 nova_compute[187283]: 2025-12-03 14:32:34.318 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:34 np0005544118 podman[213709]: 2025-12-03 14:32:34.821989918 +0000 UTC m=+0.051260330 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:32:35 np0005544118 podman[197639]: time="2025-12-03T14:32:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:32:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:32:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:32:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:32:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2589 "" "Go-http-client/1.1"
Dec  3 09:32:36 np0005544118 nova_compute[187283]: 2025-12-03 14:32:36.990 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:36 np0005544118 nova_compute[187283]: 2025-12-03 14:32:36.990 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.015 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.089 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.090 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.095 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.096 187287 INFO nova.compute.claims [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.192 187287 DEBUG nova.compute.provider_tree [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.207 187287 DEBUG nova.scheduler.client.report [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.229 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.230 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.275 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.276 187287 DEBUG nova.network.neutron [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.296 187287 INFO nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.315 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.409 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.411 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.411 187287 INFO nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Creating image(s)#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.412 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "/var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.412 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.413 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.426 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.484 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.485 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.485 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.495 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.543 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.544 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.574 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.575 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.576 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.629 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.630 187287 DEBUG nova.virt.disk.api [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Checking if we can resize image /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.631 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.687 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.688 187287 DEBUG nova.virt.disk.api [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Cannot resize image /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.689 187287 DEBUG nova.objects.instance [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'migration_context' on Instance uuid dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.706 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.706 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Ensure instance console log exists: /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.707 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.707 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:37 np0005544118 nova_compute[187283]: 2025-12-03 14:32:37.707 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:38 np0005544118 nova_compute[187283]: 2025-12-03 14:32:38.066 187287 DEBUG nova.policy [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '050e065161ed4e4dbf457584926aae78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90982dffa8cd42a2b3280941bc8b991e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:32:38 np0005544118 nova_compute[187283]: 2025-12-03 14:32:38.312 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:38 np0005544118 nova_compute[187283]: 2025-12-03 14:32:38.728 187287 DEBUG nova.network.neutron [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Successfully created port: 56fb9930-d77f-410f-bc18-3de929c7cd78 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:32:39 np0005544118 nova_compute[187283]: 2025-12-03 14:32:39.319 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:39.458 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.047 187287 DEBUG nova.network.neutron [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Successfully updated port: 56fb9930-d77f-410f-bc18-3de929c7cd78 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.065 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.065 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquired lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.066 187287 DEBUG nova.network.neutron [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.155 187287 DEBUG nova.compute.manager [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received event network-changed-56fb9930-d77f-410f-bc18-3de929c7cd78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.156 187287 DEBUG nova.compute.manager [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Refreshing instance network info cache due to event network-changed-56fb9930-d77f-410f-bc18-3de929c7cd78. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.156 187287 DEBUG oslo_concurrency.lockutils [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:32:40 np0005544118 nova_compute[187283]: 2025-12-03 14:32:40.236 187287 DEBUG nova.network.neutron [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.382 187287 DEBUG nova.network.neutron [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Updating instance_info_cache with network_info: [{"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.541 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Releasing lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.541 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Instance network_info: |[{"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.542 187287 DEBUG oslo_concurrency.lockutils [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.543 187287 DEBUG nova.network.neutron [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Refreshing network info cache for port 56fb9930-d77f-410f-bc18-3de929c7cd78 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.547 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Start _get_guest_xml network_info=[{"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.552 187287 WARNING nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.558 187287 DEBUG nova.virt.libvirt.host [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.559 187287 DEBUG nova.virt.libvirt.host [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.568 187287 DEBUG nova.virt.libvirt.host [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.569 187287 DEBUG nova.virt.libvirt.host [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.570 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.570 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.571 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.571 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.571 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.572 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.572 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.572 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.572 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.573 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.573 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.573 187287 DEBUG nova.virt.hardware [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.578 187287 DEBUG nova.virt.libvirt.vif [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1799678532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1799678532',id=14,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-gchw7dba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:32:37Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=dcc96fb6-cdff-49c0-b6cc-2cdd526e4096,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.578 187287 DEBUG nova.network.os_vif_util [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.579 187287 DEBUG nova.network.os_vif_util [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3b:cf,bridge_name='br-int',has_traffic_filtering=True,id=56fb9930-d77f-410f-bc18-3de929c7cd78,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fb9930-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.580 187287 DEBUG nova.objects.instance [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'pci_devices' on Instance uuid dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.596 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <uuid>dcc96fb6-cdff-49c0-b6cc-2cdd526e4096</uuid>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <name>instance-0000000e</name>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteStrategies-server-1799678532</nova:name>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:32:41</nova:creationTime>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:user uuid="050e065161ed4e4dbf457584926aae78">tempest-TestExecuteStrategies-270472559-project-member</nova:user>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:project uuid="90982dffa8cd42a2b3280941bc8b991e">tempest-TestExecuteStrategies-270472559</nova:project>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        <nova:port uuid="56fb9930-d77f-410f-bc18-3de929c7cd78">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <entry name="serial">dcc96fb6-cdff-49c0-b6cc-2cdd526e4096</entry>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <entry name="uuid">dcc96fb6-cdff-49c0-b6cc-2cdd526e4096</entry>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk.config"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:2e:3b:cf"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <target dev="tap56fb9930-d7"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/console.log" append="off"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:32:41 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:32:41 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:32:41 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:32:41 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.597 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Preparing to wait for external event network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.598 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.598 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.598 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.599 187287 DEBUG nova.virt.libvirt.vif [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1799678532',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1799678532',id=14,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-gchw7dba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:32:37Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=dcc96fb6-cdff-49c0-b6cc-2cdd526e4096,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.599 187287 DEBUG nova.network.os_vif_util [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.600 187287 DEBUG nova.network.os_vif_util [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3b:cf,bridge_name='br-int',has_traffic_filtering=True,id=56fb9930-d77f-410f-bc18-3de929c7cd78,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fb9930-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.600 187287 DEBUG os_vif [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3b:cf,bridge_name='br-int',has_traffic_filtering=True,id=56fb9930-d77f-410f-bc18-3de929c7cd78,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fb9930-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.601 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.601 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.601 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.604 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.604 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56fb9930-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.604 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap56fb9930-d7, col_values=(('external_ids', {'iface-id': '56fb9930-d77f-410f-bc18-3de929c7cd78', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:3b:cf', 'vm-uuid': 'dcc96fb6-cdff-49c0-b6cc-2cdd526e4096'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.606 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:41 np0005544118 NetworkManager[55710]: <info>  [1764772361.6071] manager: (tap56fb9930-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.608 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.612 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.613 187287 INFO os_vif [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:3b:cf,bridge_name='br-int',has_traffic_filtering=True,id=56fb9930-d77f-410f-bc18-3de929c7cd78,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fb9930-d7')#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.662 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.662 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.662 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No VIF found with MAC fa:16:3e:2e:3b:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:32:41 np0005544118 nova_compute[187283]: 2025-12-03 14:32:41.663 187287 INFO nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Using config drive#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.138 187287 INFO nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Creating config drive at /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk.config#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.144 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0gfc35dh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.266 187287 DEBUG oslo_concurrency.processutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0gfc35dh" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:32:42 np0005544118 kernel: tap56fb9930-d7: entered promiscuous mode
Dec  3 09:32:42 np0005544118 NetworkManager[55710]: <info>  [1764772362.3235] manager: (tap56fb9930-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/53)
Dec  3 09:32:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:42Z|00127|binding|INFO|Claiming lport 56fb9930-d77f-410f-bc18-3de929c7cd78 for this chassis.
Dec  3 09:32:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:42Z|00128|binding|INFO|56fb9930-d77f-410f-bc18-3de929c7cd78: Claiming fa:16:3e:2e:3b:cf 10.100.0.7
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.324 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.335 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:3b:cf 10.100.0.7'], port_security=['fa:16:3e:2e:3b:cf 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'dcc96fb6-cdff-49c0-b6cc-2cdd526e4096', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=56fb9930-d77f-410f-bc18-3de929c7cd78) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.336 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 56fb9930-d77f-410f-bc18-3de929c7cd78 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.338 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:32:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:42Z|00129|binding|INFO|Setting lport 56fb9930-d77f-410f-bc18-3de929c7cd78 ovn-installed in OVS
Dec  3 09:32:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:42Z|00130|binding|INFO|Setting lport 56fb9930-d77f-410f-bc18-3de929c7cd78 up in Southbound
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.340 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.350 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aede1e25-2186-42eb-a69a-e0e9055d131c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.351 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267c5b5d-11 in ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.353 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267c5b5d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.353 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[937a9286-c737-4d82-8708-8159b08b4c9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.354 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[065a2684-9eed-4b69-a1e9-3035e1ae2c8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 systemd-udevd[213768]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:32:42 np0005544118 systemd-machined[153602]: New machine qemu-11-instance-0000000e.
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.366 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[ffb5f4d1-e0d5-409d-ab14-67499e00e568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 NetworkManager[55710]: <info>  [1764772362.3714] device (tap56fb9930-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:32:42 np0005544118 systemd[1]: Started Virtual Machine qemu-11-instance-0000000e.
Dec  3 09:32:42 np0005544118 NetworkManager[55710]: <info>  [1764772362.3724] device (tap56fb9930-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.390 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[13547461-1cb5-4007-8fc7-0006705419bc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.418 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[3ef3c478-9fe0-4129-a3a3-28ba9636baf7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.425 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a6b264-c53a-46dd-8765-2ecaf7dd9a20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 NetworkManager[55710]: <info>  [1764772362.4261] manager: (tap267c5b5d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.454 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[3841a86a-d6e3-4868-a0c1-233101a9c90d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.457 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[b61ebd81-756e-4419-a9c5-1651bb414d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 NetworkManager[55710]: <info>  [1764772362.4777] device (tap267c5b5d-10): carrier: link connected
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.482 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[9c86a770-4ba3-442c-9f47-caa67ba4dbe8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.498 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f9967502-cf93-4d8d-ba5b-d1a7b1bb8dec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451808, 'reachable_time': 38427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213801, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.513 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[10776355-dc75-49a1-ac57-ccc654a3dd2e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:71b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451808, 'tstamp': 451808}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213802, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.529 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb143b0-bbaa-460f-b3bf-e330d345789c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451808, 'reachable_time': 38427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213803, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.557 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbff5a6-ba50-4d68-b8a2-60d5de188a4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.611 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[757fbf6a-a759-4f53-b67a-a2edb70f546b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.613 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.613 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.613 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.616 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:42 np0005544118 kernel: tap267c5b5d-10: entered promiscuous mode
Dec  3 09:32:42 np0005544118 NetworkManager[55710]: <info>  [1764772362.6186] manager: (tap267c5b5d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.619 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.620 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:42Z|00131|binding|INFO|Releasing lport f97a1eb5-3e39-4381-b113-959844eaa3b1 from this chassis (sb_readonly=0)
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.620 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.621 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.622 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1528fad4-c065-4517-b70c-6653527c8ce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.622 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:32:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:32:42.623 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'env', 'PROCESS_TAG=haproxy-267c5b5d-1150-48df-8bea-7890da55de3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267c5b5d-1150-48df-8bea-7890da55de3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.631 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.719 187287 DEBUG nova.compute.manager [req-beb8097c-20eb-4c2b-a424-ff86098a7047 req-fb35f8b0-83c8-4375-824c-1f9f0a8e2b81 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received event network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.719 187287 DEBUG oslo_concurrency.lockutils [req-beb8097c-20eb-4c2b-a424-ff86098a7047 req-fb35f8b0-83c8-4375-824c-1f9f0a8e2b81 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.720 187287 DEBUG oslo_concurrency.lockutils [req-beb8097c-20eb-4c2b-a424-ff86098a7047 req-fb35f8b0-83c8-4375-824c-1f9f0a8e2b81 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.720 187287 DEBUG oslo_concurrency.lockutils [req-beb8097c-20eb-4c2b-a424-ff86098a7047 req-fb35f8b0-83c8-4375-824c-1f9f0a8e2b81 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.720 187287 DEBUG nova.compute.manager [req-beb8097c-20eb-4c2b-a424-ff86098a7047 req-fb35f8b0-83c8-4375-824c-1f9f0a8e2b81 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Processing event network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.815 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772362.8149009, dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.815 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] VM Started (Lifecycle Event)#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.818 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.825 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.828 187287 INFO nova.virt.libvirt.driver [-] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Instance spawned successfully.#033[00m
Dec  3 09:32:42 np0005544118 nova_compute[187283]: 2025-12-03 14:32:42.829 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.016 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.019 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:32:43 np0005544118 podman[213842]: 2025-12-03 14:32:43.025948066 +0000 UTC m=+0.054599934 container create ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:32:43 np0005544118 systemd[1]: Started libpod-conmon-ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4.scope.
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.088 187287 DEBUG nova.network.neutron [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Updated VIF entry in instance network info cache for port 56fb9930-d77f-410f-bc18-3de929c7cd78. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.089 187287 DEBUG nova.network.neutron [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Updating instance_info_cache with network_info: [{"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:32:43 np0005544118 podman[213842]: 2025-12-03 14:32:42.994750085 +0000 UTC m=+0.023401953 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:32:43 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:32:43 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9cdc2498e363b9d30604c0eb2721b7e5e2f2bb57f04affa52fe3e053df7cf467/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:32:43 np0005544118 podman[213842]: 2025-12-03 14:32:43.112837952 +0000 UTC m=+0.141489830 container init ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:32:43 np0005544118 podman[213842]: 2025-12-03 14:32:43.118331247 +0000 UTC m=+0.146983115 container start ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:32:43 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [NOTICE]   (213862) : New worker (213864) forked
Dec  3 09:32:43 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [NOTICE]   (213862) : Loading success.
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.161 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.162 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.162 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.163 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.163 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.163 187287 DEBUG nova.virt.libvirt.driver [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.541 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.542 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772362.8150506, dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.542 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.829 187287 INFO nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Took 6.42 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.831 187287 DEBUG nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.843 187287 DEBUG oslo_concurrency.lockutils [req-55f21959-dac1-40c5-aba7-d3a00478f1e6 req-17586bb6-7fdd-450c-a445-6bd9cf715444 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.859 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.862 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772362.8211875, dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.862 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.888 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.891 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.898 187287 INFO nova.compute.manager [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Took 6.84 seconds to build instance.#033[00m
Dec  3 09:32:43 np0005544118 nova_compute[187283]: 2025-12-03 14:32:43.920 187287 DEBUG oslo_concurrency.lockutils [None req-edecbe92-e373-4c01-99d9-ceeb64572781 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:44 np0005544118 nova_compute[187283]: 2025-12-03 14:32:44.320 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:44 np0005544118 nova_compute[187283]: 2025-12-03 14:32:44.792 187287 DEBUG nova.compute.manager [req-bff0cfbd-5cab-489b-8998-42f2e1c39a52 req-eb58c4e5-2161-4da4-8569-f0d1aebbd333 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received event network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:32:44 np0005544118 nova_compute[187283]: 2025-12-03 14:32:44.792 187287 DEBUG oslo_concurrency.lockutils [req-bff0cfbd-5cab-489b-8998-42f2e1c39a52 req-eb58c4e5-2161-4da4-8569-f0d1aebbd333 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:32:44 np0005544118 nova_compute[187283]: 2025-12-03 14:32:44.793 187287 DEBUG oslo_concurrency.lockutils [req-bff0cfbd-5cab-489b-8998-42f2e1c39a52 req-eb58c4e5-2161-4da4-8569-f0d1aebbd333 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:32:44 np0005544118 nova_compute[187283]: 2025-12-03 14:32:44.793 187287 DEBUG oslo_concurrency.lockutils [req-bff0cfbd-5cab-489b-8998-42f2e1c39a52 req-eb58c4e5-2161-4da4-8569-f0d1aebbd333 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:32:44 np0005544118 nova_compute[187283]: 2025-12-03 14:32:44.793 187287 DEBUG nova.compute.manager [req-bff0cfbd-5cab-489b-8998-42f2e1c39a52 req-eb58c4e5-2161-4da4-8569-f0d1aebbd333 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] No waiting events found dispatching network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:32:44 np0005544118 nova_compute[187283]: 2025-12-03 14:32:44.793 187287 WARNING nova.compute.manager [req-bff0cfbd-5cab-489b-8998-42f2e1c39a52 req-eb58c4e5-2161-4da4-8569-f0d1aebbd333 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received unexpected event network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:32:44 np0005544118 podman[213873]: 2025-12-03 14:32:44.815290048 +0000 UTC m=+0.050816187 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:32:46 np0005544118 nova_compute[187283]: 2025-12-03 14:32:46.626 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:47 np0005544118 podman[213890]: 2025-12-03 14:32:47.815324379 +0000 UTC m=+0.050486928 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:32:49 np0005544118 nova_compute[187283]: 2025-12-03 14:32:49.323 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:32:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:32:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:32:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:32:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:32:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:32:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:32:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:32:51 np0005544118 nova_compute[187283]: 2025-12-03 14:32:51.629 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:51 np0005544118 podman[213915]: 2025-12-03 14:32:51.859429006 +0000 UTC m=+0.088003108 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  3 09:32:54 np0005544118 nova_compute[187283]: 2025-12-03 14:32:54.323 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:56 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:56Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2e:3b:cf 10.100.0.7
Dec  3 09:32:56 np0005544118 ovn_controller[95637]: 2025-12-03T14:32:56Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2e:3b:cf 10.100.0.7
Dec  3 09:32:56 np0005544118 nova_compute[187283]: 2025-12-03 14:32:56.664 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:32:59 np0005544118 nova_compute[187283]: 2025-12-03 14:32:59.325 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:00.965 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:00.966 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:00.967 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:01 np0005544118 nova_compute[187283]: 2025-12-03 14:33:01.668 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:01 np0005544118 podman[213959]: 2025-12-03 14:33:01.818510549 +0000 UTC m=+0.052599498 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Dec  3 09:33:04 np0005544118 nova_compute[187283]: 2025-12-03 14:33:04.328 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:05 np0005544118 podman[197639]: time="2025-12-03T14:33:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:33:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:33:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:33:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:33:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec  3 09:33:05 np0005544118 podman[213981]: 2025-12-03 14:33:05.821197246 +0000 UTC m=+0.057150307 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd)
Dec  3 09:33:06 np0005544118 nova_compute[187283]: 2025-12-03 14:33:06.672 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:09 np0005544118 nova_compute[187283]: 2025-12-03 14:33:09.330 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:10 np0005544118 nova_compute[187283]: 2025-12-03 14:33:10.596 187287 DEBUG nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Creating tmpfile /var/lib/nova/instances/tmpgu33bnu1 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:33:10 np0005544118 nova_compute[187283]: 2025-12-03 14:33:10.597 187287 DEBUG nova.compute.manager [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgu33bnu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:33:11 np0005544118 nova_compute[187283]: 2025-12-03 14:33:11.674 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:11 np0005544118 nova_compute[187283]: 2025-12-03 14:33:11.960 187287 DEBUG nova.compute.manager [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgu33bnu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4a0b588a-6d26-4dcc-a5c2-e058633ee4a3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:33:12 np0005544118 nova_compute[187283]: 2025-12-03 14:33:12.028 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:33:12 np0005544118 nova_compute[187283]: 2025-12-03 14:33:12.029 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:33:12 np0005544118 nova_compute[187283]: 2025-12-03 14:33:12.029 187287 DEBUG nova.network.neutron [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:33:12 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:12Z|00132|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Dec  3 09:33:12 np0005544118 nova_compute[187283]: 2025-12-03 14:33:12.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.319 187287 DEBUG nova.network.neutron [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Updating instance_info_cache with network_info: [{"id": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "address": "fa:16:3e:55:cc:26", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b26815f-32", "ovs_interfaceid": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.332 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.340 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.342 187287 DEBUG nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgu33bnu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4a0b588a-6d26-4dcc-a5c2-e058633ee4a3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.342 187287 DEBUG nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Creating instance directory: /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.342 187287 DEBUG nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Creating disk.info with the contents: {'/var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk': 'qcow2', '/var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.343 187287 DEBUG nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.343 187287 DEBUG nova.objects.instance [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.376 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.437 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.438 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.439 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.450 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.503 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.504 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.539 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.540 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.541 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.600 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.601 187287 DEBUG nova.virt.disk.api [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.602 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.659 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.661 187287 DEBUG nova.virt.disk.api [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.661 187287 DEBUG nova.objects.instance [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.707 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.729 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk.config 485376" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.731 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk.config to /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:33:14 np0005544118 nova_compute[187283]: 2025-12-03 14:33:14.732 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk.config /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.245 187287 DEBUG oslo_concurrency.processutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk.config /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.246 187287 DEBUG nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.247 187287 DEBUG nova.virt.libvirt.vif [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:32:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-424205401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-424205401',id=13,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:32:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-vs6ajwvr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:32:30Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=4a0b588a-6d26-4dcc-a5c2-e058633ee4a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "address": "fa:16:3e:55:cc:26", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b26815f-32", "ovs_interfaceid": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.248 187287 DEBUG nova.network.os_vif_util [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "address": "fa:16:3e:55:cc:26", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4b26815f-32", "ovs_interfaceid": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.248 187287 DEBUG nova.network.os_vif_util [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:cc:26,bridge_name='br-int',has_traffic_filtering=True,id=4b26815f-329f-49fa-a7b6-54d16cc51e9c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b26815f-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.249 187287 DEBUG os_vif [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:cc:26,bridge_name='br-int',has_traffic_filtering=True,id=4b26815f-329f-49fa-a7b6-54d16cc51e9c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b26815f-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.250 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.250 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.251 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.253 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.254 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b26815f-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.254 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b26815f-32, col_values=(('external_ids', {'iface-id': '4b26815f-329f-49fa-a7b6-54d16cc51e9c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:cc:26', 'vm-uuid': '4a0b588a-6d26-4dcc-a5c2-e058633ee4a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.256 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:15 np0005544118 NetworkManager[55710]: <info>  [1764772395.2569] manager: (tap4b26815f-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.259 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.264 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.265 187287 INFO os_vif [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:cc:26,bridge_name='br-int',has_traffic_filtering=True,id=4b26815f-329f-49fa-a7b6-54d16cc51e9c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b26815f-32')#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.266 187287 DEBUG nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.266 187287 DEBUG nova.compute.manager [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgu33bnu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4a0b588a-6d26-4dcc-a5c2-e058633ee4a3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:33:15 np0005544118 nova_compute[187283]: 2025-12-03 14:33:15.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:15 np0005544118 podman[214023]: 2025-12-03 14:33:15.816691069 +0000 UTC m=+0.050371845 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:33:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:16.488 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:33:16 np0005544118 nova_compute[187283]: 2025-12-03 14:33:16.488 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:16.490 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:33:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:16.491 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:16 np0005544118 nova_compute[187283]: 2025-12-03 14:33:16.892 187287 DEBUG nova.network.neutron [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Port 4b26815f-329f-49fa-a7b6-54d16cc51e9c updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:33:16 np0005544118 nova_compute[187283]: 2025-12-03 14:33:16.894 187287 DEBUG nova.compute.manager [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgu33bnu1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='4a0b588a-6d26-4dcc-a5c2-e058633ee4a3',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:33:17 np0005544118 kernel: tap4b26815f-32: entered promiscuous mode
Dec  3 09:33:17 np0005544118 NetworkManager[55710]: <info>  [1764772397.1575] manager: (tap4b26815f-32): new Tun device (/org/freedesktop/NetworkManager/Devices/57)
Dec  3 09:33:17 np0005544118 nova_compute[187283]: 2025-12-03 14:33:17.158 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:17 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:17Z|00133|binding|INFO|Claiming lport 4b26815f-329f-49fa-a7b6-54d16cc51e9c for this additional chassis.
Dec  3 09:33:17 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:17Z|00134|binding|INFO|4b26815f-329f-49fa-a7b6-54d16cc51e9c: Claiming fa:16:3e:55:cc:26 10.100.0.14
Dec  3 09:33:17 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:17Z|00135|binding|INFO|Setting lport 4b26815f-329f-49fa-a7b6-54d16cc51e9c ovn-installed in OVS
Dec  3 09:33:17 np0005544118 nova_compute[187283]: 2025-12-03 14:33:17.174 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:17 np0005544118 systemd-udevd[214057]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:33:17 np0005544118 systemd-machined[153602]: New machine qemu-12-instance-0000000d.
Dec  3 09:33:17 np0005544118 NetworkManager[55710]: <info>  [1764772397.1957] device (tap4b26815f-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:33:17 np0005544118 NetworkManager[55710]: <info>  [1764772397.1970] device (tap4b26815f-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:33:17 np0005544118 systemd[1]: Started Virtual Machine qemu-12-instance-0000000d.
Dec  3 09:33:17 np0005544118 nova_compute[187283]: 2025-12-03 14:33:17.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:17 np0005544118 nova_compute[187283]: 2025-12-03 14:33:17.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:33:17 np0005544118 nova_compute[187283]: 2025-12-03 14:33:17.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:33:18 np0005544118 nova_compute[187283]: 2025-12-03 14:33:18.033 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:33:18 np0005544118 nova_compute[187283]: 2025-12-03 14:33:18.033 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:33:18 np0005544118 nova_compute[187283]: 2025-12-03 14:33:18.033 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:33:18 np0005544118 nova_compute[187283]: 2025-12-03 14:33:18.034 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:33:18 np0005544118 nova_compute[187283]: 2025-12-03 14:33:18.490 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772398.4904885, 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:33:18 np0005544118 nova_compute[187283]: 2025-12-03 14:33:18.491 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] VM Started (Lifecycle Event)#033[00m
Dec  3 09:33:18 np0005544118 nova_compute[187283]: 2025-12-03 14:33:18.515 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:33:18 np0005544118 podman[214074]: 2025-12-03 14:33:18.826400692 +0000 UTC m=+0.056363124 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:33:19 np0005544118 nova_compute[187283]: 2025-12-03 14:33:19.335 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:33:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:33:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:33:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:33:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:33:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:33:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:33:20 np0005544118 nova_compute[187283]: 2025-12-03 14:33:20.077 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Updating instance_info_cache with network_info: [{"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:33:20 np0005544118 nova_compute[187283]: 2025-12-03 14:33:20.257 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:20 np0005544118 nova_compute[187283]: 2025-12-03 14:33:20.413 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:33:20 np0005544118 nova_compute[187283]: 2025-12-03 14:33:20.413 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:33:20 np0005544118 nova_compute[187283]: 2025-12-03 14:33:20.414 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:20 np0005544118 nova_compute[187283]: 2025-12-03 14:33:20.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:20 np0005544118 nova_compute[187283]: 2025-12-03 14:33:20.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:21 np0005544118 nova_compute[187283]: 2025-12-03 14:33:21.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.578 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772402.5774672, 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.579 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.604 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.610 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.634 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.636 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.637 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.639 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.717 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.823 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.824 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:22 np0005544118 podman[214114]: 2025-12-03 14:33:22.836636062 +0000 UTC m=+0.145951945 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.896 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.901 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.961 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:22 np0005544118 nova_compute[187283]: 2025-12-03 14:33:22.962 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.051 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.233 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.235 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5548MB free_disk=73.27867126464844GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.235 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.235 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.280 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Migration for instance 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.300 187287 INFO nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Updating resource usage from migration 9b869d7b-25e7-4219-8041-bc809dcf0dff#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.301 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Starting to track incoming migration 9b869d7b-25e7-4219-8041-bc809dcf0dff with flavor ec610f84-c649-49d7-9c7a-a22befc31fb8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.418 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.446 187287 WARNING nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.447 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.447 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:33:23 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:23Z|00136|binding|INFO|Claiming lport 4b26815f-329f-49fa-a7b6-54d16cc51e9c for this chassis.
Dec  3 09:33:23 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:23Z|00137|binding|INFO|4b26815f-329f-49fa-a7b6-54d16cc51e9c: Claiming fa:16:3e:55:cc:26 10.100.0.14
Dec  3 09:33:23 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:23Z|00138|binding|INFO|Setting lport 4b26815f-329f-49fa-a7b6-54d16cc51e9c up in Southbound
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.514 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:cc:26 10.100.0.14'], port_security=['fa:16:3e:55:cc:26 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '4a0b588a-6d26-4dcc-a5c2-e058633ee4a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=4b26815f-329f-49fa-a7b6-54d16cc51e9c) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.516 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 4b26815f-329f-49fa-a7b6-54d16cc51e9c in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.517 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.538 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a4cd6c-a581-4d92-81b5-0f3cc22bcf5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.540 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.568 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.572 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c270f2-d7b1-4612-b0fa-8afec497a33e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.576 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[591d1d90-8ef2-4700-88db-d5f7b5600ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.608 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[beec9229-f257-4c33-af85-befb87fbf260]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.624 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.625 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.626 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4afcd2-4e6e-48ac-a826-af164642c3e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451808, 'reachable_time': 38427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214157, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.644 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[89940f79-01b7-4606-b342-3fe00254422e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451818, 'tstamp': 451818}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214158, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451821, 'tstamp': 451821}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214158, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.646 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.648 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.651 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.651 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.651 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:23.652 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:33:23 np0005544118 nova_compute[187283]: 2025-12-03 14:33:23.695 187287 INFO nova.compute.manager [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Post operation of migration started#033[00m
Dec  3 09:33:24 np0005544118 nova_compute[187283]: 2025-12-03 14:33:24.293 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:33:24 np0005544118 nova_compute[187283]: 2025-12-03 14:33:24.294 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:33:24 np0005544118 nova_compute[187283]: 2025-12-03 14:33:24.294 187287 DEBUG nova.network.neutron [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:33:24 np0005544118 nova_compute[187283]: 2025-12-03 14:33:24.336 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:25 np0005544118 nova_compute[187283]: 2025-12-03 14:33:25.261 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:25 np0005544118 nova_compute[187283]: 2025-12-03 14:33:25.627 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:25 np0005544118 nova_compute[187283]: 2025-12-03 14:33:25.628 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:33:26 np0005544118 nova_compute[187283]: 2025-12-03 14:33:26.464 187287 DEBUG nova.network.neutron [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Updating instance_info_cache with network_info: [{"id": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "address": "fa:16:3e:55:cc:26", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b26815f-32", "ovs_interfaceid": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:33:26 np0005544118 nova_compute[187283]: 2025-12-03 14:33:26.491 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:33:26 np0005544118 nova_compute[187283]: 2025-12-03 14:33:26.513 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:26 np0005544118 nova_compute[187283]: 2025-12-03 14:33:26.513 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:26 np0005544118 nova_compute[187283]: 2025-12-03 14:33:26.514 187287 DEBUG oslo_concurrency.lockutils [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:26 np0005544118 nova_compute[187283]: 2025-12-03 14:33:26.520 187287 INFO nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:33:26 np0005544118 virtqemud[186958]: Domain id=12 name='instance-0000000d' uuid=4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 is tainted: custom-monitor
Dec  3 09:33:26 np0005544118 nova_compute[187283]: 2025-12-03 14:33:26.604 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:33:27 np0005544118 nova_compute[187283]: 2025-12-03 14:33:27.530 187287 INFO nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:33:28 np0005544118 nova_compute[187283]: 2025-12-03 14:33:28.537 187287 INFO nova.virt.libvirt.driver [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:33:28 np0005544118 nova_compute[187283]: 2025-12-03 14:33:28.541 187287 DEBUG nova.compute.manager [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:33:28 np0005544118 nova_compute[187283]: 2025-12-03 14:33:28.583 187287 DEBUG nova.objects.instance [None req-c5739a19-a90e-411d-a816-a6f0f87ce035 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:33:29 np0005544118 nova_compute[187283]: 2025-12-03 14:33:29.374 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:30 np0005544118 nova_compute[187283]: 2025-12-03 14:33:30.264 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:32 np0005544118 podman[214160]: 2025-12-03 14:33:32.838476615 +0000 UTC m=+0.062556679 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, 
container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  3 09:33:34 np0005544118 nova_compute[187283]: 2025-12-03 14:33:34.433 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.061 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.061 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.062 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.062 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.063 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.064 187287 INFO nova.compute.manager [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Terminating instance#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.065 187287 DEBUG nova.compute.manager [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.266 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 kernel: tap56fb9930-d7 (unregistering): left promiscuous mode
Dec  3 09:33:35 np0005544118 NetworkManager[55710]: <info>  [1764772415.5104] device (tap56fb9930-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:33:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:35Z|00139|binding|INFO|Releasing lport 56fb9930-d77f-410f-bc18-3de929c7cd78 from this chassis (sb_readonly=0)
Dec  3 09:33:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:35Z|00140|binding|INFO|Setting lport 56fb9930-d77f-410f-bc18-3de929c7cd78 down in Southbound
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.524 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:35Z|00141|binding|INFO|Removing iface tap56fb9930-d7 ovn-installed in OVS
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.527 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.539 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:3b:cf 10.100.0.7'], port_security=['fa:16:3e:2e:3b:cf 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'dcc96fb6-cdff-49c0-b6cc-2cdd526e4096', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=56fb9930-d77f-410f-bc18-3de929c7cd78) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.543 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 56fb9930-d77f-410f-bc18-3de929c7cd78 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.545 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.546 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.564 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[758868ba-0d92-42b8-b937-ff67477ede3f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:35 np0005544118 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec  3 09:33:35 np0005544118 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000e.scope: Consumed 14.401s CPU time.
Dec  3 09:33:35 np0005544118 systemd-machined[153602]: Machine qemu-11-instance-0000000e terminated.
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.595 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[48e9918d-8bd6-4551-a62d-0fe4d66468d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.600 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[f870270e-ab86-4e0b-be7c-923114c2d899]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.623 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[5f81a680-3ed5-472f-a837-0dd0038b6ace]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:35 np0005544118 podman[197639]: time="2025-12-03T14:33:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.638 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c238c2ce-41f5-4b5b-9705-2dfbc5ba3549]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451808, 'reachable_time': 38427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214193, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:33:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:33:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:33:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3054 "" "Go-http-client/1.1"
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.657 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[389cda24-6ee0-4086-9847-24209d7a74c2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451818, 'tstamp': 451818}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214194, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 451821, 'tstamp': 451821}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214194, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.659 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.661 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.664 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.664 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.665 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.665 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:35.665 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.735 187287 INFO nova.virt.libvirt.driver [-] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Instance destroyed successfully.#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.736 187287 DEBUG nova.objects.instance [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.771 187287 DEBUG nova.virt.libvirt.vif [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:32:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1799678532',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1799678532',id=14,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:32:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-gchw7dba',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:32:43Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=dcc96fb6-cdff-49c0-b6cc-2cdd526e4096,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.772 187287 DEBUG nova.network.os_vif_util [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "56fb9930-d77f-410f-bc18-3de929c7cd78", "address": "fa:16:3e:2e:3b:cf", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56fb9930-d7", "ovs_interfaceid": "56fb9930-d77f-410f-bc18-3de929c7cd78", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.773 187287 DEBUG nova.network.os_vif_util [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:3b:cf,bridge_name='br-int',has_traffic_filtering=True,id=56fb9930-d77f-410f-bc18-3de929c7cd78,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fb9930-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.774 187287 DEBUG os_vif [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:3b:cf,bridge_name='br-int',has_traffic_filtering=True,id=56fb9930-d77f-410f-bc18-3de929c7cd78,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fb9930-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.777 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.778 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56fb9930-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.780 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.781 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.785 187287 INFO os_vif [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:3b:cf,bridge_name='br-int',has_traffic_filtering=True,id=56fb9930-d77f-410f-bc18-3de929c7cd78,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56fb9930-d7')#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.786 187287 INFO nova.virt.libvirt.driver [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Deleting instance files /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096_del#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.786 187287 INFO nova.virt.libvirt.driver [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Deletion of /var/lib/nova/instances/dcc96fb6-cdff-49c0-b6cc-2cdd526e4096_del complete#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.847 187287 INFO nova.compute.manager [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.848 187287 DEBUG oslo.service.loopingcall [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.848 187287 DEBUG nova.compute.manager [-] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:33:35 np0005544118 nova_compute[187283]: 2025-12-03 14:33:35.848 187287 DEBUG nova.network.neutron [-] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.227 187287 DEBUG nova.compute.manager [req-7a9b604b-fe88-4ec7-8767-71b57ecbd907 req-b5064c68-adbf-4cdc-a371-97e00ba23bfd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received event network-vif-unplugged-56fb9930-d77f-410f-bc18-3de929c7cd78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.227 187287 DEBUG oslo_concurrency.lockutils [req-7a9b604b-fe88-4ec7-8767-71b57ecbd907 req-b5064c68-adbf-4cdc-a371-97e00ba23bfd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.227 187287 DEBUG oslo_concurrency.lockutils [req-7a9b604b-fe88-4ec7-8767-71b57ecbd907 req-b5064c68-adbf-4cdc-a371-97e00ba23bfd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.228 187287 DEBUG oslo_concurrency.lockutils [req-7a9b604b-fe88-4ec7-8767-71b57ecbd907 req-b5064c68-adbf-4cdc-a371-97e00ba23bfd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.228 187287 DEBUG nova.compute.manager [req-7a9b604b-fe88-4ec7-8767-71b57ecbd907 req-b5064c68-adbf-4cdc-a371-97e00ba23bfd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] No waiting events found dispatching network-vif-unplugged-56fb9930-d77f-410f-bc18-3de929c7cd78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.228 187287 DEBUG nova.compute.manager [req-7a9b604b-fe88-4ec7-8767-71b57ecbd907 req-b5064c68-adbf-4cdc-a371-97e00ba23bfd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received event network-vif-unplugged-56fb9930-d77f-410f-bc18-3de929c7cd78 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.532 187287 DEBUG nova.network.neutron [-] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.556 187287 INFO nova.compute.manager [-] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Took 0.71 seconds to deallocate network for instance.#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.612 187287 DEBUG nova.compute.manager [req-3cc42880-e39a-495d-9879-102692559045 req-9fdb1112-bdf5-4f0c-a466-688d4e171c53 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received event network-vif-deleted-56fb9930-d77f-410f-bc18-3de929c7cd78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.641 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.641 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.743 187287 DEBUG nova.compute.provider_tree [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.760 187287 DEBUG nova.scheduler.client.report [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.789 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.825 187287 INFO nova.scheduler.client.report [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance dcc96fb6-cdff-49c0-b6cc-2cdd526e4096#033[00m
Dec  3 09:33:36 np0005544118 podman[214213]: 2025-12-03 14:33:36.849493478 +0000 UTC m=+0.080540857 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:33:36 np0005544118 nova_compute[187283]: 2025-12-03 14:33:36.933 187287 DEBUG oslo_concurrency.lockutils [None req-b3a8d10a-92c5-43f5-af48-bf205cfcb736 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.872s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.449 187287 DEBUG nova.compute.manager [req-ce8c1eef-d8cf-49e5-9e3b-c53bdfa37646 req-4f769b7c-7f2c-4f64-97c0-1b34e7f9ba7b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received event network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.450 187287 DEBUG oslo_concurrency.lockutils [req-ce8c1eef-d8cf-49e5-9e3b-c53bdfa37646 req-4f769b7c-7f2c-4f64-97c0-1b34e7f9ba7b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.450 187287 DEBUG oslo_concurrency.lockutils [req-ce8c1eef-d8cf-49e5-9e3b-c53bdfa37646 req-4f769b7c-7f2c-4f64-97c0-1b34e7f9ba7b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.451 187287 DEBUG oslo_concurrency.lockutils [req-ce8c1eef-d8cf-49e5-9e3b-c53bdfa37646 req-4f769b7c-7f2c-4f64-97c0-1b34e7f9ba7b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "dcc96fb6-cdff-49c0-b6cc-2cdd526e4096-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.451 187287 DEBUG nova.compute.manager [req-ce8c1eef-d8cf-49e5-9e3b-c53bdfa37646 req-4f769b7c-7f2c-4f64-97c0-1b34e7f9ba7b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] No waiting events found dispatching network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.452 187287 WARNING nova.compute.manager [req-ce8c1eef-d8cf-49e5-9e3b-c53bdfa37646 req-4f769b7c-7f2c-4f64-97c0-1b34e7f9ba7b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Received unexpected event network-vif-plugged-56fb9930-d77f-410f-bc18-3de929c7cd78 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.487 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.487 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.488 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.488 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.488 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.490 187287 INFO nova.compute.manager [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Terminating instance#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.491 187287 DEBUG nova.compute.manager [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:33:38 np0005544118 kernel: tap4b26815f-32 (unregistering): left promiscuous mode
Dec  3 09:33:38 np0005544118 NetworkManager[55710]: <info>  [1764772418.6208] device (tap4b26815f-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.637 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:38 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:38Z|00142|binding|INFO|Releasing lport 4b26815f-329f-49fa-a7b6-54d16cc51e9c from this chassis (sb_readonly=0)
Dec  3 09:33:38 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:38Z|00143|binding|INFO|Setting lport 4b26815f-329f-49fa-a7b6-54d16cc51e9c down in Southbound
Dec  3 09:33:38 np0005544118 ovn_controller[95637]: 2025-12-03T14:33:38Z|00144|binding|INFO|Removing iface tap4b26815f-32 ovn-installed in OVS
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.640 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.657 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:38 np0005544118 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec  3 09:33:38 np0005544118 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000d.scope: Consumed 2.457s CPU time.
Dec  3 09:33:38 np0005544118 systemd-machined[153602]: Machine qemu-12-instance-0000000d terminated.
Dec  3 09:33:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:38.693 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:cc:26 10.100.0.14'], port_security=['fa:16:3e:55:cc:26 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '4a0b588a-6d26-4dcc-a5c2-e058633ee4a3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=4b26815f-329f-49fa-a7b6-54d16cc51e9c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:33:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:38.694 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 4b26815f-329f-49fa-a7b6-54d16cc51e9c in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:33:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:38.695 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267c5b5d-1150-48df-8bea-7890da55de3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:33:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:38.696 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5776fbd7-427a-48b4-9b6f-0728fedd690f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:38.696 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace which is not needed anymore#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.718 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.723 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.760 187287 INFO nova.virt.libvirt.driver [-] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Instance destroyed successfully.#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.761 187287 DEBUG nova.objects.instance [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.882 187287 DEBUG nova.virt.libvirt.vif [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:32:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-424205401',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-424205401',id=13,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:32:30Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-vs6ajwvr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:33:28Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=4a0b588a-6d26-4dcc-a5c2-e058633ee4a3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "address": "fa:16:3e:55:cc:26", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b26815f-32", "ovs_interfaceid": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.882 187287 DEBUG nova.network.os_vif_util [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "address": "fa:16:3e:55:cc:26", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b26815f-32", "ovs_interfaceid": "4b26815f-329f-49fa-a7b6-54d16cc51e9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.883 187287 DEBUG nova.network.os_vif_util [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:55:cc:26,bridge_name='br-int',has_traffic_filtering=True,id=4b26815f-329f-49fa-a7b6-54d16cc51e9c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b26815f-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.884 187287 DEBUG os_vif [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:cc:26,bridge_name='br-int',has_traffic_filtering=True,id=4b26815f-329f-49fa-a7b6-54d16cc51e9c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b26815f-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.885 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.885 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b26815f-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.888 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.891 187287 INFO os_vif [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:55:cc:26,bridge_name='br-int',has_traffic_filtering=True,id=4b26815f-329f-49fa-a7b6-54d16cc51e9c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b26815f-32')#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.891 187287 INFO nova.virt.libvirt.driver [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Deleting instance files /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3_del#033[00m
Dec  3 09:33:38 np0005544118 nova_compute[187283]: 2025-12-03 14:33:38.892 187287 INFO nova.virt.libvirt.driver [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Deletion of /var/lib/nova/instances/4a0b588a-6d26-4dcc-a5c2-e058633ee4a3_del complete#033[00m
Dec  3 09:33:39 np0005544118 nova_compute[187283]: 2025-12-03 14:33:39.170 187287 INFO nova.compute.manager [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Took 0.68 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:33:39 np0005544118 nova_compute[187283]: 2025-12-03 14:33:39.171 187287 DEBUG oslo.service.loopingcall [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:33:39 np0005544118 nova_compute[187283]: 2025-12-03 14:33:39.171 187287 DEBUG nova.compute.manager [-] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:33:39 np0005544118 nova_compute[187283]: 2025-12-03 14:33:39.171 187287 DEBUG nova.network.neutron [-] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:33:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [NOTICE]   (213862) : haproxy version is 2.8.14-c23fe91
Dec  3 09:33:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [NOTICE]   (213862) : path to executable is /usr/sbin/haproxy
Dec  3 09:33:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [WARNING]  (213862) : Exiting Master process...
Dec  3 09:33:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [WARNING]  (213862) : Exiting Master process...
Dec  3 09:33:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [ALERT]    (213862) : Current worker (213864) exited with code 143 (Terminated)
Dec  3 09:33:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[213858]: [WARNING]  (213862) : All workers exited. Exiting... (0)
Dec  3 09:33:39 np0005544118 systemd[1]: libpod-ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4.scope: Deactivated successfully.
Dec  3 09:33:39 np0005544118 podman[214277]: 2025-12-03 14:33:39.292081144 +0000 UTC m=+0.479413911 container died ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec  3 09:33:39 np0005544118 nova_compute[187283]: 2025-12-03 14:33:39.435 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:39 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4-userdata-shm.mount: Deactivated successfully.
Dec  3 09:33:39 np0005544118 systemd[1]: var-lib-containers-storage-overlay-9cdc2498e363b9d30604c0eb2721b7e5e2f2bb57f04affa52fe3e053df7cf467-merged.mount: Deactivated successfully.
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.111 187287 DEBUG nova.network.neutron [-] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.137 187287 INFO nova.compute.manager [-] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Took 0.97 seconds to deallocate network for instance.#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.217 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.218 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.227 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.009s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.315 187287 DEBUG nova.compute.manager [req-d3975646-4410-4691-bc4e-09bcae8de761 req-bf57c602-96b3-418e-909e-9eefa4b9316d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Received event network-vif-deleted-4b26815f-329f-49fa-a7b6-54d16cc51e9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:33:40 np0005544118 podman[214277]: 2025-12-03 14:33:40.510203641 +0000 UTC m=+1.697536418 container cleanup ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.516 187287 INFO nova.scheduler.client.report [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3#033[00m
Dec  3 09:33:40 np0005544118 systemd[1]: libpod-conmon-ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4.scope: Deactivated successfully.
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.809 187287 DEBUG nova.compute.manager [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Received event network-vif-unplugged-4b26815f-329f-49fa-a7b6-54d16cc51e9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.810 187287 DEBUG oslo_concurrency.lockutils [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.811 187287 DEBUG oslo_concurrency.lockutils [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.811 187287 DEBUG oslo_concurrency.lockutils [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.812 187287 DEBUG nova.compute.manager [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] No waiting events found dispatching network-vif-unplugged-4b26815f-329f-49fa-a7b6-54d16cc51e9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.812 187287 WARNING nova.compute.manager [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Received unexpected event network-vif-unplugged-4b26815f-329f-49fa-a7b6-54d16cc51e9c for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.813 187287 DEBUG nova.compute.manager [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Received event network-vif-plugged-4b26815f-329f-49fa-a7b6-54d16cc51e9c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.813 187287 DEBUG oslo_concurrency.lockutils [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.813 187287 DEBUG oslo_concurrency.lockutils [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.814 187287 DEBUG oslo_concurrency.lockutils [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.814 187287 DEBUG nova.compute.manager [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] No waiting events found dispatching network-vif-plugged-4b26815f-329f-49fa-a7b6-54d16cc51e9c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:33:40 np0005544118 nova_compute[187283]: 2025-12-03 14:33:40.814 187287 WARNING nova.compute.manager [req-71a1121f-2e81-4256-b196-807a1dfd9ae9 req-179be53c-b2ed-45dc-9b7c-82f5393650f0 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Received unexpected event network-vif-plugged-4b26815f-329f-49fa-a7b6-54d16cc51e9c for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:33:41 np0005544118 podman[214308]: 2025-12-03 14:33:41.039571222 +0000 UTC m=+0.506233359 container remove ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.046 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef13ff2-d822-4da8-bd94-691e96168506]: (4, ('Wed Dec  3 02:33:38 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4)\nac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4\nWed Dec  3 02:33:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (ac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4)\nac14ce0609e2113e2f5190cd15a30e879d510d8e40ac712234e6e31320ae76e4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.047 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a07d578c-886b-4c2a-922a-47ee3399a463]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.048 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:33:41 np0005544118 kernel: tap267c5b5d-10: left promiscuous mode
Dec  3 09:33:41 np0005544118 nova_compute[187283]: 2025-12-03 14:33:41.051 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:41 np0005544118 nova_compute[187283]: 2025-12-03 14:33:41.064 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.067 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe74f6b-b1ce-44d3-98df-a9450f2c59e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.082 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0797bd8f-f46f-47ef-a789-830ee0860aae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.083 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fd18acb9-2acf-449c-9acc-26e22450e81a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.096 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[78f9e58e-d38b-4c0b-9609-1fcea1401992]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 451802, 'reachable_time': 41033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214324, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:41 np0005544118 systemd[1]: run-netns-ovnmeta\x2d267c5b5d\x2d1150\x2d48df\x2d8bea\x2d7890da55de3f.mount: Deactivated successfully.
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.104 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:33:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:33:41.104 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[520d9746-de95-49f1-99aa-1725827bed4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:33:41 np0005544118 nova_compute[187283]: 2025-12-03 14:33:41.718 187287 DEBUG oslo_concurrency.lockutils [None req-8523504d-6a7e-4cc4-be08-c2c74a35e12f 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "4a0b588a-6d26-4dcc-a5c2-e058633ee4a3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:33:43 np0005544118 nova_compute[187283]: 2025-12-03 14:33:43.888 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:44 np0005544118 nova_compute[187283]: 2025-12-03 14:33:44.437 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:46 np0005544118 podman[214326]: 2025-12-03 14:33:46.843549648 +0000 UTC m=+0.075926256 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:33:48 np0005544118 nova_compute[187283]: 2025-12-03 14:33:48.890 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:33:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:33:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:33:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:33:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:33:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:33:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:33:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:33:49 np0005544118 nova_compute[187283]: 2025-12-03 14:33:49.438 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:49 np0005544118 podman[214345]: 2025-12-03 14:33:49.815800033 +0000 UTC m=+0.047696148 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:33:50 np0005544118 nova_compute[187283]: 2025-12-03 14:33:50.733 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772415.7314367, dcc96fb6-cdff-49c0-b6cc-2cdd526e4096 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:33:50 np0005544118 nova_compute[187283]: 2025-12-03 14:33:50.734 187287 INFO nova.compute.manager [-] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:33:50 np0005544118 nova_compute[187283]: 2025-12-03 14:33:50.807 187287 DEBUG nova.compute.manager [None req-9a74ddf7-1c03-4226-a62a-dbab04855d94 - - - - - -] [instance: dcc96fb6-cdff-49c0-b6cc-2cdd526e4096] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:33:53 np0005544118 nova_compute[187283]: 2025-12-03 14:33:53.759 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772418.7580209, 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:33:53 np0005544118 nova_compute[187283]: 2025-12-03 14:33:53.759 187287 INFO nova.compute.manager [-] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:33:53 np0005544118 nova_compute[187283]: 2025-12-03 14:33:53.791 187287 DEBUG nova.compute.manager [None req-2e81d4ca-b090-4e4c-b8db-1ebd2126619c - - - - - -] [instance: 4a0b588a-6d26-4dcc-a5c2-e058633ee4a3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:33:53 np0005544118 podman[214371]: 2025-12-03 14:33:53.885603578 +0000 UTC m=+0.121154276 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec  3 09:33:53 np0005544118 nova_compute[187283]: 2025-12-03 14:33:53.892 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:54 np0005544118 nova_compute[187283]: 2025-12-03 14:33:54.479 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:58 np0005544118 nova_compute[187283]: 2025-12-03 14:33:58.935 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:33:59 np0005544118 nova_compute[187283]: 2025-12-03 14:33:59.482 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:00.966 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:00.967 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:00.967 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:03 np0005544118 podman[214396]: 2025-12-03 14:34:03.829502731 +0000 UTC m=+0.057285360 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, version=9.6)
Dec  3 09:34:03 np0005544118 nova_compute[187283]: 2025-12-03 14:34:03.937 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:04 np0005544118 nova_compute[187283]: 2025-12-03 14:34:04.535 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:05 np0005544118 podman[197639]: time="2025-12-03T14:34:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:34:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:34:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:34:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:34:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec  3 09:34:07 np0005544118 podman[214418]: 2025-12-03 14:34:07.822155934 +0000 UTC m=+0.052044082 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:34:08 np0005544118 nova_compute[187283]: 2025-12-03 14:34:08.939 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:09 np0005544118 nova_compute[187283]: 2025-12-03 14:34:09.536 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:10 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:10Z|00145|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  3 09:34:13 np0005544118 nova_compute[187283]: 2025-12-03 14:34:13.941 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:14 np0005544118 nova_compute[187283]: 2025-12-03 14:34:14.583 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:14 np0005544118 nova_compute[187283]: 2025-12-03 14:34:14.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:16 np0005544118 nova_compute[187283]: 2025-12-03 14:34:16.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:17 np0005544118 nova_compute[187283]: 2025-12-03 14:34:17.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:17 np0005544118 nova_compute[187283]: 2025-12-03 14:34:17.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:34:17 np0005544118 nova_compute[187283]: 2025-12-03 14:34:17.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:34:17 np0005544118 nova_compute[187283]: 2025-12-03 14:34:17.633 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:34:17 np0005544118 nova_compute[187283]: 2025-12-03 14:34:17.634 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:17 np0005544118 podman[214439]: 2025-12-03 14:34:17.808300912 +0000 UTC m=+0.046528736 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  3 09:34:18 np0005544118 nova_compute[187283]: 2025-12-03 14:34:18.943 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:34:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:34:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:34:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:34:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:34:19 np0005544118 nova_compute[187283]: 2025-12-03 14:34:19.585 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:20.524 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:34:20 np0005544118 nova_compute[187283]: 2025-12-03 14:34:20.523 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:20.524 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:34:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:20.525 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:34:20 np0005544118 nova_compute[187283]: 2025-12-03 14:34:20.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:20 np0005544118 podman[214459]: 2025-12-03 14:34:20.855785643 +0000 UTC m=+0.077958203 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:34:21 np0005544118 nova_compute[187283]: 2025-12-03 14:34:21.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:23 np0005544118 nova_compute[187283]: 2025-12-03 14:34:23.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:23 np0005544118 nova_compute[187283]: 2025-12-03 14:34:23.945 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.586 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.657 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.657 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.657 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.657 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:34:24 np0005544118 podman[214484]: 2025-12-03 14:34:24.80606314 +0000 UTC m=+0.103303271 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.830 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.831 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.33635330200195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.832 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.832 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.962 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.963 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.979 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.997 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:34:24 np0005544118 nova_compute[187283]: 2025-12-03 14:34:24.998 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:34:25 np0005544118 nova_compute[187283]: 2025-12-03 14:34:25.012 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:34:25 np0005544118 nova_compute[187283]: 2025-12-03 14:34:25.043 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:34:25 np0005544118 nova_compute[187283]: 2025-12-03 14:34:25.070 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:34:25 np0005544118 nova_compute[187283]: 2025-12-03 14:34:25.086 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:34:25 np0005544118 nova_compute[187283]: 2025-12-03 14:34:25.113 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:34:25 np0005544118 nova_compute[187283]: 2025-12-03 14:34:25.113 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.281s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:26 np0005544118 nova_compute[187283]: 2025-12-03 14:34:26.113 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:34:26 np0005544118 nova_compute[187283]: 2025-12-03 14:34:26.114 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:34:28 np0005544118 nova_compute[187283]: 2025-12-03 14:34:28.947 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:29 np0005544118 nova_compute[187283]: 2025-12-03 14:34:29.631 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.055 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.056 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.300 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.799 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.800 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.806 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.806 187287 INFO nova.compute.claims [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:34:33 np0005544118 nova_compute[187283]: 2025-12-03 14:34:33.949 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.013 187287 DEBUG nova.compute.provider_tree [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.046 187287 DEBUG nova.scheduler.client.report [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.098 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.098 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.207 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.207 187287 DEBUG nova.network.neutron [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.255 187287 INFO nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.316 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.459 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.461 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.461 187287 INFO nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Creating image(s)#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.462 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "/var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.462 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.463 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.480 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.538 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.539 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.540 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.551 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.611 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.612 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.671 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.692 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk 1073741824" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.693 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.693 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.751 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.752 187287 DEBUG nova.virt.disk.api [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Checking if we can resize image /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.752 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.809 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.810 187287 DEBUG nova.virt.disk.api [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Cannot resize image /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.811 187287 DEBUG nova.objects.instance [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'migration_context' on Instance uuid 801daf39-165b-40fd-85ba-84d4007360c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:34:34 np0005544118 podman[214521]: 2025-12-03 14:34:34.816470124 +0000 UTC m=+0.051420825 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.826 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.826 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Ensure instance console log exists: /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.827 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.827 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:34 np0005544118 nova_compute[187283]: 2025-12-03 14:34:34.827 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:35 np0005544118 nova_compute[187283]: 2025-12-03 14:34:35.134 187287 DEBUG nova.policy [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '050e065161ed4e4dbf457584926aae78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90982dffa8cd42a2b3280941bc8b991e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:34:35 np0005544118 podman[197639]: time="2025-12-03T14:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:34:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:34:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:34:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:34:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2590 "" "Go-http-client/1.1"
Dec  3 09:34:36 np0005544118 nova_compute[187283]: 2025-12-03 14:34:36.891 187287 DEBUG nova.network.neutron [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Successfully created port: a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.521 187287 DEBUG nova.network.neutron [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Successfully updated port: a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.548 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.549 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquired lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.549 187287 DEBUG nova.network.neutron [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.631 187287 DEBUG nova.compute.manager [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received event network-changed-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.631 187287 DEBUG nova.compute.manager [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Refreshing instance network info cache due to event network-changed-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.632 187287 DEBUG oslo_concurrency.lockutils [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.722 187287 DEBUG nova.network.neutron [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:34:38 np0005544118 podman[214547]: 2025-12-03 14:34:38.820739397 +0000 UTC m=+0.057418804 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec  3 09:34:38 np0005544118 nova_compute[187283]: 2025-12-03 14:34:38.978 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:39 np0005544118 nova_compute[187283]: 2025-12-03 14:34:39.672 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.142 187287 DEBUG nova.network.neutron [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Updating instance_info_cache with network_info: [{"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.337 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Releasing lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.338 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Instance network_info: |[{"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.338 187287 DEBUG oslo_concurrency.lockutils [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.338 187287 DEBUG nova.network.neutron [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Refreshing network info cache for port a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.341 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Start _get_guest_xml network_info=[{"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.345 187287 WARNING nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.349 187287 DEBUG nova.virt.libvirt.host [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.350 187287 DEBUG nova.virt.libvirt.host [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.353 187287 DEBUG nova.virt.libvirt.host [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.353 187287 DEBUG nova.virt.libvirt.host [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.355 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.355 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.355 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.356 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.356 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.356 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.356 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.356 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.357 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.357 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.357 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.357 187287 DEBUG nova.virt.hardware [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.361 187287 DEBUG nova.virt.libvirt.vif [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:34:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1921420340',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1921420340',id=16,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-tl5u3j0z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:34:34Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=801daf39-165b-40fd-85ba-84d4007360c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.361 187287 DEBUG nova.network.os_vif_util [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.362 187287 DEBUG nova.network.os_vif_util [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:95:0d,bridge_name='br-int',has_traffic_filtering=True,id=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cf6de3-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.363 187287 DEBUG nova.objects.instance [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'pci_devices' on Instance uuid 801daf39-165b-40fd-85ba-84d4007360c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.395 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <uuid>801daf39-165b-40fd-85ba-84d4007360c1</uuid>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <name>instance-00000010</name>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteStrategies-server-1921420340</nova:name>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:34:40</nova:creationTime>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:user uuid="050e065161ed4e4dbf457584926aae78">tempest-TestExecuteStrategies-270472559-project-member</nova:user>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:project uuid="90982dffa8cd42a2b3280941bc8b991e">tempest-TestExecuteStrategies-270472559</nova:project>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        <nova:port uuid="a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <entry name="serial">801daf39-165b-40fd-85ba-84d4007360c1</entry>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <entry name="uuid">801daf39-165b-40fd-85ba-84d4007360c1</entry>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk.config"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:78:95:0d"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <target dev="tapa8cf6de3-f4"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/console.log" append="off"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:34:40 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:34:40 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:34:40 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:34:40 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.397 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Preparing to wait for external event network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.397 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.398 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.398 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.400 187287 DEBUG nova.virt.libvirt.vif [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:34:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1921420340',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1921420340',id=16,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-tl5u3j0z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:34:34Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=801daf39-165b-40fd-85ba-84d4007360c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.400 187287 DEBUG nova.network.os_vif_util [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.401 187287 DEBUG nova.network.os_vif_util [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:95:0d,bridge_name='br-int',has_traffic_filtering=True,id=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cf6de3-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.402 187287 DEBUG os_vif [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:95:0d,bridge_name='br-int',has_traffic_filtering=True,id=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cf6de3-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.403 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.403 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.404 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.407 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.407 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8cf6de3-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.408 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8cf6de3-f4, col_values=(('external_ids', {'iface-id': 'a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:95:0d', 'vm-uuid': '801daf39-165b-40fd-85ba-84d4007360c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.409 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:40 np0005544118 NetworkManager[55710]: <info>  [1764772480.4107] manager: (tapa8cf6de3-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.412 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.417 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.418 187287 INFO os_vif [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:95:0d,bridge_name='br-int',has_traffic_filtering=True,id=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cf6de3-f4')#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.782 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.782 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.783 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No VIF found with MAC fa:16:3e:78:95:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:34:40 np0005544118 nova_compute[187283]: 2025-12-03 14:34:40.783 187287 INFO nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Using config drive#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.212 187287 INFO nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Creating config drive at /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk.config#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.219 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptoq55b2x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.342 187287 DEBUG oslo_concurrency.processutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptoq55b2x" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:34:41 np0005544118 kernel: tapa8cf6de3-f4: entered promiscuous mode
Dec  3 09:34:41 np0005544118 NetworkManager[55710]: <info>  [1764772481.4044] manager: (tapa8cf6de3-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Dec  3 09:34:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:41Z|00146|binding|INFO|Claiming lport a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c for this chassis.
Dec  3 09:34:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:41Z|00147|binding|INFO|a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c: Claiming fa:16:3e:78:95:0d 10.100.0.3
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.406 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:41Z|00148|binding|INFO|Setting lport a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c ovn-installed in OVS
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.418 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.420 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 systemd-udevd[214585]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:34:41 np0005544118 systemd-machined[153602]: New machine qemu-13-instance-00000010.
Dec  3 09:34:41 np0005544118 NetworkManager[55710]: <info>  [1764772481.4451] device (tapa8cf6de3-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:34:41 np0005544118 NetworkManager[55710]: <info>  [1764772481.4467] device (tapa8cf6de3-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:34:41 np0005544118 systemd[1]: Started Virtual Machine qemu-13-instance-00000010.
Dec  3 09:34:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:41Z|00149|binding|INFO|Setting lport a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c up in Southbound
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.480 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:95:0d 10.100.0.3'], port_security=['fa:16:3e:78:95:0d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '801daf39-165b-40fd-85ba-84d4007360c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.481 104491 INFO neutron.agent.ovn.metadata.agent [-] Port a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.482 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.494 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[562740c0-88c0-4c25-80d6-6ef0b425da7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.495 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267c5b5d-11 in ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.497 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267c5b5d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.497 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac323c6-c8e7-4e29-b88c-7b6a1055a9dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.498 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[86ccf53d-a260-4400-ad85-81714de345c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.510 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[9f90d650-3945-421f-9312-098c6737a53c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.525 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd37a22-4790-46a0-a558-eaa8e1c21c2d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.552 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[2d283eeb-7108-41bd-b445-58439a18d443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 systemd-udevd[214588]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:34:41 np0005544118 NetworkManager[55710]: <info>  [1764772481.5603] manager: (tap267c5b5d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.560 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8d00f7ec-aca8-40a3-b50d-26486aee241d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.591 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[f80cd8b0-05da-4c4e-a90c-b96843c35b5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.594 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[0d78d3ab-9b74-4c3f-8146-2175f9e7bf0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 NetworkManager[55710]: <info>  [1764772481.6157] device (tap267c5b5d-10): carrier: link connected
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.621 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[816ce995-048e-47b2-b04e-63e95653ae86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.637 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[816b209f-b39d-4c21-8516-f843d0d3ff49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463722, 'reachable_time': 38261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214621, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.653 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cc75d570-143e-4e71-b591-f2f6d1ad24bb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:71b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463722, 'tstamp': 463722}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214626, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.672 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[062c6b78-533b-40be-b337-e58bbcd573ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463722, 'reachable_time': 38261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214627, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.708 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7ddef5-0dd4-4505-9680-3c20c937bd22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.735 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772481.7349417, 801daf39-165b-40fd-85ba-84d4007360c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.735 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] VM Started (Lifecycle Event)#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.780 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[825a8500-6f06-42a1-828e-632cc7abfd0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.782 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.783 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.783 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.785 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 kernel: tap267c5b5d-10: entered promiscuous mode
Dec  3 09:34:41 np0005544118 NetworkManager[55710]: <info>  [1764772481.7864] manager: (tap267c5b5d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.787 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.788 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.790 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:41Z|00150|binding|INFO|Releasing lport f97a1eb5-3e39-4381-b113-959844eaa3b1 from this chassis (sb_readonly=0)
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.802 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.803 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.804 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.805 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[91766c86-c56e-4f97-aff1-d347b78cd82b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.806 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:34:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:34:41.806 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'env', 'PROCESS_TAG=haproxy-267c5b5d-1150-48df-8bea-7890da55de3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267c5b5d-1150-48df-8bea-7890da55de3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.966 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.971 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772481.7356522, 801daf39-165b-40fd-85ba-84d4007360c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.971 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.996 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:34:41 np0005544118 nova_compute[187283]: 2025-12-03 14:34:41.999 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.030 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.153 187287 DEBUG nova.network.neutron [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Updated VIF entry in instance network info cache for port a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.154 187287 DEBUG nova.network.neutron [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Updating instance_info_cache with network_info: [{"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:34:42 np0005544118 podman[214659]: 2025-12-03 14:34:42.172864967 +0000 UTC m=+0.044147998 container create 29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  3 09:34:42 np0005544118 systemd[1]: Started libpod-conmon-29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747.scope.
Dec  3 09:34:42 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:34:42 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f20bd18ab1fa9f58043381d88ee3b78ce0ad54394a4f7d8528eaefb271af6698/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:34:42 np0005544118 podman[214659]: 2025-12-03 14:34:42.148799527 +0000 UTC m=+0.020082578 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.248 187287 DEBUG oslo_concurrency.lockutils [req-0fc5cd1b-822b-4eef-9640-e88a40b14489 req-bcc308f1-8ec5-4a90-8e69-e931f671c7e9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:34:42 np0005544118 podman[214659]: 2025-12-03 14:34:42.259939248 +0000 UTC m=+0.131222319 container init 29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:34:42 np0005544118 podman[214659]: 2025-12-03 14:34:42.267882783 +0000 UTC m=+0.139165854 container start 29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  3 09:34:42 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[214675]: [NOTICE]   (214679) : New worker (214681) forked
Dec  3 09:34:42 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[214675]: [NOTICE]   (214679) : Loading success.
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.300 187287 DEBUG nova.compute.manager [req-ebac6286-aed5-4534-8cce-737276bd6f5a req-7a2a321c-f977-424c-890e-f994427f5ad5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received event network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.301 187287 DEBUG oslo_concurrency.lockutils [req-ebac6286-aed5-4534-8cce-737276bd6f5a req-7a2a321c-f977-424c-890e-f994427f5ad5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.301 187287 DEBUG oslo_concurrency.lockutils [req-ebac6286-aed5-4534-8cce-737276bd6f5a req-7a2a321c-f977-424c-890e-f994427f5ad5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.302 187287 DEBUG oslo_concurrency.lockutils [req-ebac6286-aed5-4534-8cce-737276bd6f5a req-7a2a321c-f977-424c-890e-f994427f5ad5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.302 187287 DEBUG nova.compute.manager [req-ebac6286-aed5-4534-8cce-737276bd6f5a req-7a2a321c-f977-424c-890e-f994427f5ad5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Processing event network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.303 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.307 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772482.3071294, 801daf39-165b-40fd-85ba-84d4007360c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.308 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.310 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.316 187287 INFO nova.virt.libvirt.driver [-] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Instance spawned successfully.#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.317 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.563 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.571 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.576 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.577 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.578 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.578 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.579 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.580 187287 DEBUG nova.virt.libvirt.driver [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.708 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.942 187287 INFO nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Took 8.48 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:34:42 np0005544118 nova_compute[187283]: 2025-12-03 14:34:42.943 187287 DEBUG nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:34:43 np0005544118 nova_compute[187283]: 2025-12-03 14:34:43.313 187287 INFO nova.compute.manager [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Took 9.54 seconds to build instance.#033[00m
Dec  3 09:34:43 np0005544118 nova_compute[187283]: 2025-12-03 14:34:43.352 187287 DEBUG oslo_concurrency.lockutils [None req-cd6b016f-2ab4-47ff-83a2-f4f945d42bca 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:44 np0005544118 nova_compute[187283]: 2025-12-03 14:34:44.469 187287 DEBUG nova.compute.manager [req-4855347b-3479-4b05-a748-6c0c13d65ef6 req-d756bc59-0be6-4f72-8460-54c3aa19f6ea c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received event network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:34:44 np0005544118 nova_compute[187283]: 2025-12-03 14:34:44.470 187287 DEBUG oslo_concurrency.lockutils [req-4855347b-3479-4b05-a748-6c0c13d65ef6 req-d756bc59-0be6-4f72-8460-54c3aa19f6ea c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:34:44 np0005544118 nova_compute[187283]: 2025-12-03 14:34:44.470 187287 DEBUG oslo_concurrency.lockutils [req-4855347b-3479-4b05-a748-6c0c13d65ef6 req-d756bc59-0be6-4f72-8460-54c3aa19f6ea c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:34:44 np0005544118 nova_compute[187283]: 2025-12-03 14:34:44.471 187287 DEBUG oslo_concurrency.lockutils [req-4855347b-3479-4b05-a748-6c0c13d65ef6 req-d756bc59-0be6-4f72-8460-54c3aa19f6ea c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:34:44 np0005544118 nova_compute[187283]: 2025-12-03 14:34:44.471 187287 DEBUG nova.compute.manager [req-4855347b-3479-4b05-a748-6c0c13d65ef6 req-d756bc59-0be6-4f72-8460-54c3aa19f6ea c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] No waiting events found dispatching network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:34:44 np0005544118 nova_compute[187283]: 2025-12-03 14:34:44.471 187287 WARNING nova.compute.manager [req-4855347b-3479-4b05-a748-6c0c13d65ef6 req-d756bc59-0be6-4f72-8460-54c3aa19f6ea c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received unexpected event network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c for instance with vm_state active and task_state None.#033[00m
Dec  3 09:34:44 np0005544118 nova_compute[187283]: 2025-12-03 14:34:44.723 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:45 np0005544118 nova_compute[187283]: 2025-12-03 14:34:45.410 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:48 np0005544118 podman[214690]: 2025-12-03 14:34:48.824014088 +0000 UTC m=+0.053325758 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:34:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:34:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:34:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:34:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:34:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:34:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:34:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:34:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:34:49 np0005544118 nova_compute[187283]: 2025-12-03 14:34:49.724 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:50 np0005544118 nova_compute[187283]: 2025-12-03 14:34:50.412 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:51 np0005544118 podman[214710]: 2025-12-03 14:34:51.810256718 +0000 UTC m=+0.044645283 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:34:54 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:54Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:95:0d 10.100.0.3
Dec  3 09:34:54 np0005544118 ovn_controller[95637]: 2025-12-03T14:34:54Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:95:0d 10.100.0.3
Dec  3 09:34:54 np0005544118 nova_compute[187283]: 2025-12-03 14:34:54.727 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:55 np0005544118 nova_compute[187283]: 2025-12-03 14:34:55.426 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:34:55 np0005544118 podman[214750]: 2025-12-03 14:34:55.854580973 +0000 UTC m=+0.084743567 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  3 09:34:59 np0005544118 nova_compute[187283]: 2025-12-03 14:34:59.729 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:00 np0005544118 nova_compute[187283]: 2025-12-03 14:35:00.429 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:00.967 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:00.968 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:00.969 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:04 np0005544118 nova_compute[187283]: 2025-12-03 14:35:04.731 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:05 np0005544118 nova_compute[187283]: 2025-12-03 14:35:05.432 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:05 np0005544118 podman[197639]: time="2025-12-03T14:35:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:35:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:35:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:35:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:35:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3059 "" "Go-http-client/1.1"
Dec  3 09:35:05 np0005544118 podman[214776]: 2025-12-03 14:35:05.828231718 +0000 UTC m=+0.059871114 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Dec  3 09:35:09 np0005544118 nova_compute[187283]: 2025-12-03 14:35:09.733 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:09 np0005544118 podman[214800]: 2025-12-03 14:35:09.859488112 +0000 UTC m=+0.096557320 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125)
Dec  3 09:35:10 np0005544118 nova_compute[187283]: 2025-12-03 14:35:10.435 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:11 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:11Z|00151|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec  3 09:35:11 np0005544118 nova_compute[187283]: 2025-12-03 14:35:11.634 187287 DEBUG nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Creating tmpfile /var/lib/nova/instances/tmpv6n7_le2 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:35:11 np0005544118 nova_compute[187283]: 2025-12-03 14:35:11.750 187287 DEBUG nova.compute.manager [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6n7_le2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:35:13 np0005544118 nova_compute[187283]: 2025-12-03 14:35:13.984 187287 DEBUG nova.compute.manager [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6n7_le2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:35:14 np0005544118 nova_compute[187283]: 2025-12-03 14:35:14.012 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:35:14 np0005544118 nova_compute[187283]: 2025-12-03 14:35:14.013 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:35:14 np0005544118 nova_compute[187283]: 2025-12-03 14:35:14.013 187287 DEBUG nova.network.neutron [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:35:14 np0005544118 nova_compute[187283]: 2025-12-03 14:35:14.736 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.437 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.917 187287 DEBUG nova.network.neutron [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Updating instance_info_cache with network_info: [{"id": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "address": "fa:16:3e:6a:ec:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30c508a-97", "ovs_interfaceid": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.944 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.946 187287 DEBUG nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6n7_le2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.947 187287 DEBUG nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Creating instance directory: /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.947 187287 DEBUG nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Creating disk.info with the contents: {'/var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk': 'qcow2', '/var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.948 187287 DEBUG nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.948 187287 DEBUG nova.objects.instance [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:35:15 np0005544118 nova_compute[187283]: 2025-12-03 14:35:15.979 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.042 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.043 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.044 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.055 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.117 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.118 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.150 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk 1073741824" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.151 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.151 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.206 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.208 187287 DEBUG nova.virt.disk.api [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.208 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.280 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.281 187287 DEBUG nova.virt.disk.api [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.282 187287 DEBUG nova.objects.instance [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.305 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.332 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk.config 485376" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.335 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk.config to /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.335 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk.config /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.869 187287 DEBUG oslo_concurrency.processutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk.config /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.870 187287 DEBUG nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.872 187287 DEBUG nova.virt.libvirt.vif [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:34:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1441281371',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1441281371',id=15,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:34:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-onhwgnyu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:34:26Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "address": "fa:16:3e:6a:ec:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc30c508a-97", "ovs_interfaceid": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.872 187287 DEBUG nova.network.os_vif_util [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "address": "fa:16:3e:6a:ec:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc30c508a-97", "ovs_interfaceid": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.873 187287 DEBUG nova.network.os_vif_util [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:ec:05,bridge_name='br-int',has_traffic_filtering=True,id=c30c508a-97f3-4d74-96e9-967b4d03ed1a,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30c508a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.874 187287 DEBUG os_vif [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:ec:05,bridge_name='br-int',has_traffic_filtering=True,id=c30c508a-97f3-4d74-96e9-967b4d03ed1a,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30c508a-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.875 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.875 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.876 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.879 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.880 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc30c508a-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.880 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc30c508a-97, col_values=(('external_ids', {'iface-id': 'c30c508a-97f3-4d74-96e9-967b4d03ed1a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:ec:05', 'vm-uuid': '0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.882 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:16 np0005544118 NetworkManager[55710]: <info>  [1764772516.8849] manager: (tapc30c508a-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.885 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.888 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.889 187287 INFO os_vif [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:ec:05,bridge_name='br-int',has_traffic_filtering=True,id=c30c508a-97f3-4d74-96e9-967b4d03ed1a,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30c508a-97')#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.890 187287 DEBUG nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:35:16 np0005544118 nova_compute[187283]: 2025-12-03 14:35:16.890 187287 DEBUG nova.compute.manager [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6n7_le2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:35:17 np0005544118 nova_compute[187283]: 2025-12-03 14:35:17.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:18 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:18.249 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.250 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:18 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:18.251 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.745 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.746 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.746 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.746 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 801daf39-165b-40fd-85ba-84d4007360c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.957 187287 DEBUG nova.network.neutron [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Port c30c508a-97f3-4d74-96e9-967b4d03ed1a updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:35:18 np0005544118 nova_compute[187283]: 2025-12-03 14:35:18.959 187287 DEBUG nova.compute.manager [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpv6n7_le2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:35:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:35:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:35:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:35:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:35:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:35:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:35:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:35:19 np0005544118 kernel: tapc30c508a-97: entered promiscuous mode
Dec  3 09:35:19 np0005544118 NetworkManager[55710]: <info>  [1764772519.6191] manager: (tapc30c508a-97): new Tun device (/org/freedesktop/NetworkManager/Devices/63)
Dec  3 09:35:19 np0005544118 nova_compute[187283]: 2025-12-03 14:35:19.621 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:19 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:19Z|00152|binding|INFO|Claiming lport c30c508a-97f3-4d74-96e9-967b4d03ed1a for this additional chassis.
Dec  3 09:35:19 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:19Z|00153|binding|INFO|c30c508a-97f3-4d74-96e9-967b4d03ed1a: Claiming fa:16:3e:6a:ec:05 10.100.0.11
Dec  3 09:35:19 np0005544118 podman[214842]: 2025-12-03 14:35:19.631419196 +0000 UTC m=+0.056070646 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  3 09:35:19 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:19Z|00154|binding|INFO|Setting lport c30c508a-97f3-4d74-96e9-967b4d03ed1a ovn-installed in OVS
Dec  3 09:35:19 np0005544118 nova_compute[187283]: 2025-12-03 14:35:19.642 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:19 np0005544118 nova_compute[187283]: 2025-12-03 14:35:19.643 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:19 np0005544118 systemd-udevd[214871]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:35:19 np0005544118 NetworkManager[55710]: <info>  [1764772519.6626] device (tapc30c508a-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:35:19 np0005544118 NetworkManager[55710]: <info>  [1764772519.6633] device (tapc30c508a-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:35:19 np0005544118 systemd-machined[153602]: New machine qemu-14-instance-0000000f.
Dec  3 09:35:19 np0005544118 systemd[1]: Started Virtual Machine qemu-14-instance-0000000f.
Dec  3 09:35:19 np0005544118 nova_compute[187283]: 2025-12-03 14:35:19.801 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:20 np0005544118 nova_compute[187283]: 2025-12-03 14:35:20.671 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Updating instance_info_cache with network_info: [{"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:35:20 np0005544118 nova_compute[187283]: 2025-12-03 14:35:20.698 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-801daf39-165b-40fd-85ba-84d4007360c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:35:20 np0005544118 nova_compute[187283]: 2025-12-03 14:35:20.699 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:35:20 np0005544118 nova_compute[187283]: 2025-12-03 14:35:20.699 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:20 np0005544118 nova_compute[187283]: 2025-12-03 14:35:20.699 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:21 np0005544118 nova_compute[187283]: 2025-12-03 14:35:21.179 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772521.1787868, 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:35:21 np0005544118 nova_compute[187283]: 2025-12-03 14:35:21.179 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] VM Started (Lifecycle Event)#033[00m
Dec  3 09:35:21 np0005544118 nova_compute[187283]: 2025-12-03 14:35:21.201 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:35:21 np0005544118 nova_compute[187283]: 2025-12-03 14:35:21.882 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:22 np0005544118 nova_compute[187283]: 2025-12-03 14:35:22.580 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772522.5800252, 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:35:22 np0005544118 nova_compute[187283]: 2025-12-03 14:35:22.581 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:35:22 np0005544118 nova_compute[187283]: 2025-12-03 14:35:22.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:22 np0005544118 nova_compute[187283]: 2025-12-03 14:35:22.620 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:35:22 np0005544118 nova_compute[187283]: 2025-12-03 14:35:22.623 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:35:22 np0005544118 nova_compute[187283]: 2025-12-03 14:35:22.652 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:35:22 np0005544118 podman[214899]: 2025-12-03 14:35:22.8155565 +0000 UTC m=+0.050669074 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:35:23 np0005544118 nova_compute[187283]: 2025-12-03 14:35:23.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:23 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:23Z|00155|binding|INFO|Claiming lport c30c508a-97f3-4d74-96e9-967b4d03ed1a for this chassis.
Dec  3 09:35:23 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:23Z|00156|binding|INFO|c30c508a-97f3-4d74-96e9-967b4d03ed1a: Claiming fa:16:3e:6a:ec:05 10.100.0.11
Dec  3 09:35:23 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:23Z|00157|binding|INFO|Setting lport c30c508a-97f3-4d74-96e9-967b4d03ed1a up in Southbound
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.689 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:ec:05 10.100.0.11'], port_security=['fa:16:3e:6a:ec:05 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=c30c508a-97f3-4d74-96e9-967b4d03ed1a) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.691 104491 INFO neutron.agent.ovn.metadata.agent [-] Port c30c508a-97f3-4d74-96e9-967b4d03ed1a in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.692 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.708 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fed5c987-5431-4f23-82df-354c92df05f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.741 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc41e47-0222-4fc5-86bb-9aa35daec93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.745 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2cadb6-92cd-4658-b4ba-cf5fe8a8bd59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.773 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[7d1a1efd-54cb-40f2-b09c-306fbb99930b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.796 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9b696833-e189-4f93-a4a5-a80fddf53b52]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463722, 'reachable_time': 38261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214929, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.813 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5f54ad8e-cc46-4849-a71a-f3ab7c72013d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463734, 'tstamp': 463734}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214930, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463738, 'tstamp': 463738}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214930, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.816 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:23 np0005544118 nova_compute[187283]: 2025-12-03 14:35:23.818 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.819 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.819 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.820 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:23.820 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:35:23 np0005544118 nova_compute[187283]: 2025-12-03 14:35:23.932 187287 INFO nova.compute.manager [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Post operation of migration started#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.194 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.195 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.195 187287 DEBUG nova.network.neutron [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.630 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.631 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.699 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.792 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.795 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.815 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.865 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.872 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.932 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.933 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:35:24 np0005544118 nova_compute[187283]: 2025-12-03 14:35:24.987 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:35:25 np0005544118 nova_compute[187283]: 2025-12-03 14:35:25.136 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:35:25 np0005544118 nova_compute[187283]: 2025-12-03 14:35:25.137 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5553MB free_disk=73.27860641479492GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:35:25 np0005544118 nova_compute[187283]: 2025-12-03 14:35:25.138 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:25 np0005544118 nova_compute[187283]: 2025-12-03 14:35:25.138 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:25 np0005544118 nova_compute[187283]: 2025-12-03 14:35:25.162 187287 DEBUG nova.network.neutron [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Updating instance_info_cache with network_info: [{"id": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "address": "fa:16:3e:6a:ec:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30c508a-97", "ovs_interfaceid": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.022 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.045 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Migration for instance 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.047 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.067 187287 INFO nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Updating resource usage from migration b0ab26b8-0746-4eec-8139-e8e6f22c25fb#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.067 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Starting to track incoming migration b0ab26b8-0746-4eec-8139-e8e6f22c25fb with flavor ec610f84-c649-49d7-9c7a-a22befc31fb8 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.110 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 801daf39-165b-40fd-85ba-84d4007360c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.128 187287 WARNING nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 has been moved to another host compute-0.ctlplane.example.com(compute-0.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.128 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.129 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.183 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.200 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.221 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.222 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.222 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.223 187287 DEBUG oslo_concurrency.lockutils [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.228 187287 INFO nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:35:26 np0005544118 virtqemud[186958]: Domain id=14 name='instance-0000000f' uuid=0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 is tainted: custom-monitor
Dec  3 09:35:26 np0005544118 nova_compute[187283]: 2025-12-03 14:35:26.883 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:26 np0005544118 podman[214944]: 2025-12-03 14:35:26.88842472 +0000 UTC m=+0.115721201 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:35:27 np0005544118 nova_compute[187283]: 2025-12-03 14:35:27.234 187287 INFO nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:35:28 np0005544118 nova_compute[187283]: 2025-12-03 14:35:28.219 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:28 np0005544118 nova_compute[187283]: 2025-12-03 14:35:28.240 187287 INFO nova.virt.libvirt.driver [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:35:28 np0005544118 nova_compute[187283]: 2025-12-03 14:35:28.244 187287 DEBUG nova.compute.manager [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:35:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:28.253 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:28 np0005544118 nova_compute[187283]: 2025-12-03 14:35:28.421 187287 DEBUG nova.objects.instance [None req-c68ea71b-c201-46a3-9009-a532a4459b5a b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:35:28 np0005544118 nova_compute[187283]: 2025-12-03 14:35:28.425 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:35:28 np0005544118 nova_compute[187283]: 2025-12-03 14:35:28.426 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:35:29 np0005544118 nova_compute[187283]: 2025-12-03 14:35:29.802 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:31 np0005544118 nova_compute[187283]: 2025-12-03 14:35:31.884 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.611 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.611 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.612 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.612 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.612 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.613 187287 INFO nova.compute.manager [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Terminating instance#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.614 187287 DEBUG nova.compute.manager [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:35:34 np0005544118 kernel: tapa8cf6de3-f4 (unregistering): left promiscuous mode
Dec  3 09:35:34 np0005544118 NetworkManager[55710]: <info>  [1764772534.6453] device (tapa8cf6de3-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:35:34 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:34Z|00158|binding|INFO|Releasing lport a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c from this chassis (sb_readonly=0)
Dec  3 09:35:34 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:34Z|00159|binding|INFO|Setting lport a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c down in Southbound
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.651 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:34Z|00160|binding|INFO|Removing iface tapa8cf6de3-f4 ovn-installed in OVS
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.653 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.660 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:95:0d 10.100.0.3'], port_security=['fa:16:3e:78:95:0d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '801daf39-165b-40fd-85ba-84d4007360c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.662 104491 INFO neutron.agent.ovn.metadata.agent [-] Port a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.663 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.665 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.682 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d5c0a9-7bb4-4062-8a0d-21cf80c8a12c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:34 np0005544118 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Deactivated successfully.
Dec  3 09:35:34 np0005544118 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Consumed 14.152s CPU time.
Dec  3 09:35:34 np0005544118 systemd-machined[153602]: Machine qemu-13-instance-00000010 terminated.
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.712 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[98bb9bae-d449-4893-8200-ea833c2e6da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.714 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[440d638b-2d00-4421-afc2-9bccbda574e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.742 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[c712b363-fcf3-4ec8-b287-51d03a586af4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.758 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a9c88897-61d7-4bf0-98f5-824c6b945b97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463722, 'reachable_time': 38261, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214982, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.779 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[66e1e28f-fa37-40e4-bca3-84c5f72adc4b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463734, 'tstamp': 463734}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214983, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 463738, 'tstamp': 463738}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214983, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.781 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.819 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.824 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.824 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.825 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.825 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:34 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:34.825 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.836 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.840 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.882 187287 INFO nova.virt.libvirt.driver [-] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Instance destroyed successfully.#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.882 187287 DEBUG nova.objects.instance [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid 801daf39-165b-40fd-85ba-84d4007360c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.899 187287 DEBUG nova.virt.libvirt.vif [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:34:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1921420340',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1921420340',id=16,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:34:42Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-tl5u3j0z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:34:43Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=801daf39-165b-40fd-85ba-84d4007360c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.900 187287 DEBUG nova.network.os_vif_util [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "address": "fa:16:3e:78:95:0d", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8cf6de3-f4", "ovs_interfaceid": "a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.900 187287 DEBUG nova.network.os_vif_util [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:78:95:0d,bridge_name='br-int',has_traffic_filtering=True,id=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cf6de3-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.901 187287 DEBUG os_vif [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:95:0d,bridge_name='br-int',has_traffic_filtering=True,id=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cf6de3-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.902 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.902 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8cf6de3-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.904 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.905 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.907 187287 INFO os_vif [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:78:95:0d,bridge_name='br-int',has_traffic_filtering=True,id=a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8cf6de3-f4')#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.907 187287 INFO nova.virt.libvirt.driver [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Deleting instance files /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1_del#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.908 187287 INFO nova.virt.libvirt.driver [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Deletion of /var/lib/nova/instances/801daf39-165b-40fd-85ba-84d4007360c1_del complete#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.953 187287 INFO nova.compute.manager [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.954 187287 DEBUG oslo.service.loopingcall [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.954 187287 DEBUG nova.compute.manager [-] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:35:34 np0005544118 nova_compute[187283]: 2025-12-03 14:35:34.954 187287 DEBUG nova.network.neutron [-] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:35:35 np0005544118 podman[197639]: time="2025-12-03T14:35:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:35:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:35:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:35:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:35:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3058 "" "Go-http-client/1.1"
Dec  3 09:35:36 np0005544118 nova_compute[187283]: 2025-12-03 14:35:36.824 187287 DEBUG nova.compute.manager [req-2232eeef-cc3a-4f85-8bb0-6921cfb23938 req-16416403-aae5-43e4-9447-7e036a9ad09f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received event network-vif-unplugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:35:36 np0005544118 nova_compute[187283]: 2025-12-03 14:35:36.825 187287 DEBUG oslo_concurrency.lockutils [req-2232eeef-cc3a-4f85-8bb0-6921cfb23938 req-16416403-aae5-43e4-9447-7e036a9ad09f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:36 np0005544118 nova_compute[187283]: 2025-12-03 14:35:36.825 187287 DEBUG oslo_concurrency.lockutils [req-2232eeef-cc3a-4f85-8bb0-6921cfb23938 req-16416403-aae5-43e4-9447-7e036a9ad09f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:36 np0005544118 nova_compute[187283]: 2025-12-03 14:35:36.825 187287 DEBUG oslo_concurrency.lockutils [req-2232eeef-cc3a-4f85-8bb0-6921cfb23938 req-16416403-aae5-43e4-9447-7e036a9ad09f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:36 np0005544118 nova_compute[187283]: 2025-12-03 14:35:36.825 187287 DEBUG nova.compute.manager [req-2232eeef-cc3a-4f85-8bb0-6921cfb23938 req-16416403-aae5-43e4-9447-7e036a9ad09f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] No waiting events found dispatching network-vif-unplugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:35:36 np0005544118 nova_compute[187283]: 2025-12-03 14:35:36.826 187287 DEBUG nova.compute.manager [req-2232eeef-cc3a-4f85-8bb0-6921cfb23938 req-16416403-aae5-43e4-9447-7e036a9ad09f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received event network-vif-unplugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:35:36 np0005544118 podman[215001]: 2025-12-03 14:35:36.863439665 +0000 UTC m=+0.077003898 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.6)
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.883 187287 DEBUG nova.network.neutron [-] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.924 187287 DEBUG nova.compute.manager [req-bfd07d39-47cc-431c-a61f-c2210d040685 req-43ae6531-84a2-491c-8627-33cf73734c58 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received event network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.924 187287 DEBUG oslo_concurrency.lockutils [req-bfd07d39-47cc-431c-a61f-c2210d040685 req-43ae6531-84a2-491c-8627-33cf73734c58 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "801daf39-165b-40fd-85ba-84d4007360c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.925 187287 DEBUG oslo_concurrency.lockutils [req-bfd07d39-47cc-431c-a61f-c2210d040685 req-43ae6531-84a2-491c-8627-33cf73734c58 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.925 187287 DEBUG oslo_concurrency.lockutils [req-bfd07d39-47cc-431c-a61f-c2210d040685 req-43ae6531-84a2-491c-8627-33cf73734c58 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.925 187287 DEBUG nova.compute.manager [req-bfd07d39-47cc-431c-a61f-c2210d040685 req-43ae6531-84a2-491c-8627-33cf73734c58 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] No waiting events found dispatching network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.925 187287 WARNING nova.compute.manager [req-bfd07d39-47cc-431c-a61f-c2210d040685 req-43ae6531-84a2-491c-8627-33cf73734c58 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received unexpected event network-vif-plugged-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c for instance with vm_state active and task_state deleting.#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.927 187287 INFO nova.compute.manager [-] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Took 3.97 seconds to deallocate network for instance.#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.970 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:38 np0005544118 nova_compute[187283]: 2025-12-03 14:35:38.971 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:39 np0005544118 nova_compute[187283]: 2025-12-03 14:35:39.042 187287 DEBUG nova.compute.provider_tree [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:35:39 np0005544118 nova_compute[187283]: 2025-12-03 14:35:39.059 187287 DEBUG nova.scheduler.client.report [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:35:39 np0005544118 nova_compute[187283]: 2025-12-03 14:35:39.080 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:39 np0005544118 nova_compute[187283]: 2025-12-03 14:35:39.106 187287 INFO nova.scheduler.client.report [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance 801daf39-165b-40fd-85ba-84d4007360c1#033[00m
Dec  3 09:35:39 np0005544118 nova_compute[187283]: 2025-12-03 14:35:39.180 187287 DEBUG oslo_concurrency.lockutils [None req-8a5170e6-4793-442b-b351-5bb5723aee03 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "801daf39-165b-40fd-85ba-84d4007360c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:39 np0005544118 nova_compute[187283]: 2025-12-03 14:35:39.821 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:39 np0005544118 nova_compute[187283]: 2025-12-03 14:35:39.904 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.307 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.307 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.308 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.308 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.308 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.309 187287 INFO nova.compute.manager [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Terminating instance#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.310 187287 DEBUG nova.compute.manager [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:35:40 np0005544118 kernel: tapc30c508a-97 (unregistering): left promiscuous mode
Dec  3 09:35:40 np0005544118 NetworkManager[55710]: <info>  [1764772540.3562] device (tapc30c508a-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.358 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:40Z|00161|binding|INFO|Releasing lport c30c508a-97f3-4d74-96e9-967b4d03ed1a from this chassis (sb_readonly=0)
Dec  3 09:35:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:40Z|00162|binding|INFO|Setting lport c30c508a-97f3-4d74-96e9-967b4d03ed1a down in Southbound
Dec  3 09:35:40 np0005544118 ovn_controller[95637]: 2025-12-03T14:35:40Z|00163|binding|INFO|Removing iface tapc30c508a-97 ovn-installed in OVS
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.360 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.366 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:ec:05 10.100.0.11'], port_security=['fa:16:3e:6a:ec:05 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=c30c508a-97f3-4d74-96e9-967b4d03ed1a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.367 104491 INFO neutron.agent.ovn.metadata.agent [-] Port c30c508a-97f3-4d74-96e9-967b4d03ed1a in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.368 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267c5b5d-1150-48df-8bea-7890da55de3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.369 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1cec4f96-a932-486b-8bec-2b2ccc18f926]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.370 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace which is not needed anymore#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.375 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Dec  3 09:35:40 np0005544118 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000000f.scope: Consumed 2.708s CPU time.
Dec  3 09:35:40 np0005544118 systemd-machined[153602]: Machine qemu-14-instance-0000000f terminated.
Dec  3 09:35:40 np0005544118 podman[215023]: 2025-12-03 14:35:40.481474471 +0000 UTC m=+0.091816917 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  3 09:35:40 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[214675]: [NOTICE]   (214679) : haproxy version is 2.8.14-c23fe91
Dec  3 09:35:40 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[214675]: [NOTICE]   (214679) : path to executable is /usr/sbin/haproxy
Dec  3 09:35:40 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[214675]: [WARNING]  (214679) : Exiting Master process...
Dec  3 09:35:40 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[214675]: [ALERT]    (214679) : Current worker (214681) exited with code 143 (Terminated)
Dec  3 09:35:40 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[214675]: [WARNING]  (214679) : All workers exited. Exiting... (0)
Dec  3 09:35:40 np0005544118 systemd[1]: libpod-29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747.scope: Deactivated successfully.
Dec  3 09:35:40 np0005544118 podman[215067]: 2025-12-03 14:35:40.496274529 +0000 UTC m=+0.042929574 container died 29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125)
Dec  3 09:35:40 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747-userdata-shm.mount: Deactivated successfully.
Dec  3 09:35:40 np0005544118 systemd[1]: var-lib-containers-storage-overlay-f20bd18ab1fa9f58043381d88ee3b78ce0ad54394a4f7d8528eaefb271af6698-merged.mount: Deactivated successfully.
Dec  3 09:35:40 np0005544118 podman[215067]: 2025-12-03 14:35:40.542909147 +0000 UTC m=+0.089564192 container cleanup 29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec  3 09:35:40 np0005544118 systemd[1]: libpod-conmon-29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747.scope: Deactivated successfully.
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.575 187287 INFO nova.virt.libvirt.driver [-] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Instance destroyed successfully.#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.576 187287 DEBUG nova.objects.instance [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.593 187287 DEBUG nova.virt.libvirt.vif [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:34:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1441281371',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1441281371',id=15,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:34:26Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-onhwgnyu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:35:28Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "address": "fa:16:3e:6a:ec:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30c508a-97", "ovs_interfaceid": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.593 187287 DEBUG nova.network.os_vif_util [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "address": "fa:16:3e:6a:ec:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc30c508a-97", "ovs_interfaceid": "c30c508a-97f3-4d74-96e9-967b4d03ed1a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.594 187287 DEBUG nova.network.os_vif_util [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:ec:05,bridge_name='br-int',has_traffic_filtering=True,id=c30c508a-97f3-4d74-96e9-967b4d03ed1a,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30c508a-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.594 187287 DEBUG os_vif [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:ec:05,bridge_name='br-int',has_traffic_filtering=True,id=c30c508a-97f3-4d74-96e9-967b4d03ed1a,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30c508a-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.596 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.596 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc30c508a-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.627 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.629 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.631 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.634 187287 INFO os_vif [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:ec:05,bridge_name='br-int',has_traffic_filtering=True,id=c30c508a-97f3-4d74-96e9-967b4d03ed1a,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc30c508a-97')#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.635 187287 INFO nova.virt.libvirt.driver [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Deleting instance files /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0_del#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.635 187287 INFO nova.virt.libvirt.driver [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Deletion of /var/lib/nova/instances/0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0_del complete#033[00m
Dec  3 09:35:40 np0005544118 podman[215105]: 2025-12-03 14:35:40.643209051 +0000 UTC m=+0.075818443 container remove 29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.648 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[60ebfc72-885a-491b-92d9-6d239b37e36c]: (4, ('Wed Dec  3 02:35:40 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747)\n29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747\nWed Dec  3 02:35:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747)\n29d60239c8409bf38c2bd58c23a90df91c13c94324c6b32828b2ce642fcc0747\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.650 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[859954c4-89ff-4939-93f2-00ea3d5f3bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.651 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:35:40 np0005544118 kernel: tap267c5b5d-10: left promiscuous mode
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.654 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.665 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.668 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8f468e3c-6189-46c9-9772-96d87cf0e06e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.682 187287 INFO nova.compute.manager [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.683 187287 DEBUG oslo.service.loopingcall [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.683 187287 DEBUG nova.compute.manager [-] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.684 187287 DEBUG nova.network.neutron [-] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.689 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b14b77bd-c75c-429a-84d1-828ef9443cc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.691 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8ba6da10-2b9e-4884-9631-5829c1cd8f9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.706 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3905df-0685-4ca8-9e78-d09fdda79be7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 463715, 'reachable_time': 38883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215126, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.709 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:35:40 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:35:40.710 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[e43fcc12-a657-4202-938c-f450ab167abd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:35:40 np0005544118 systemd[1]: run-netns-ovnmeta\x2d267c5b5d\x2d1150\x2d48df\x2d8bea\x2d7890da55de3f.mount: Deactivated successfully.
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.898 187287 DEBUG nova.compute.manager [req-92d613f2-e6f4-41b2-b5e9-c2d9da37d352 req-8a0a1761-c234-414b-956d-04a90fb38a4b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Received event network-vif-unplugged-c30c508a-97f3-4d74-96e9-967b4d03ed1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.898 187287 DEBUG oslo_concurrency.lockutils [req-92d613f2-e6f4-41b2-b5e9-c2d9da37d352 req-8a0a1761-c234-414b-956d-04a90fb38a4b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.901 187287 DEBUG oslo_concurrency.lockutils [req-92d613f2-e6f4-41b2-b5e9-c2d9da37d352 req-8a0a1761-c234-414b-956d-04a90fb38a4b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.902 187287 DEBUG oslo_concurrency.lockutils [req-92d613f2-e6f4-41b2-b5e9-c2d9da37d352 req-8a0a1761-c234-414b-956d-04a90fb38a4b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.903 187287 DEBUG nova.compute.manager [req-92d613f2-e6f4-41b2-b5e9-c2d9da37d352 req-8a0a1761-c234-414b-956d-04a90fb38a4b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] No waiting events found dispatching network-vif-unplugged-c30c508a-97f3-4d74-96e9-967b4d03ed1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  3 09:35:40 np0005544118 nova_compute[187283]: 2025-12-03 14:35:40.903 187287 DEBUG nova.compute.manager [req-92d613f2-e6f4-41b2-b5e9-c2d9da37d352 req-8a0a1761-c234-414b-956d-04a90fb38a4b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Received event network-vif-unplugged-c30c508a-97f3-4d74-96e9-967b4d03ed1a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.001 187287 DEBUG nova.compute.manager [req-1107e327-d575-4ce3-ab34-15ead6bf6d2b req-ddf8b4ca-be32-4fa6-af7a-40e504863dec c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Received event network-vif-deleted-a8cf6de3-f467-49cf-8a3c-d2f2bbcbc46c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.223 187287 DEBUG nova.network.neutron [-] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.248 187287 INFO nova.compute.manager [-] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Took 0.56 seconds to deallocate network for instance.
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.308 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.308 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.313 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.345 187287 INFO nova.scheduler.client.report [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0
Dec  3 09:35:41 np0005544118 nova_compute[187283]: 2025-12-03 14:35:41.421 187287 DEBUG oslo_concurrency.lockutils [None req-a09fc27e-1ce1-42c4-92b6-47c03c335c3a 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:35:42 np0005544118 nova_compute[187283]: 2025-12-03 14:35:42.992 187287 DEBUG nova.compute.manager [req-a9dc8bd2-6db5-4e6a-9251-030edc8d23b9 req-c0e9c8a8-64cf-406a-be48-7fc984e0fb62 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Received event network-vif-plugged-c30c508a-97f3-4d74-96e9-967b4d03ed1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  3 09:35:42 np0005544118 nova_compute[187283]: 2025-12-03 14:35:42.992 187287 DEBUG oslo_concurrency.lockutils [req-a9dc8bd2-6db5-4e6a-9251-030edc8d23b9 req-c0e9c8a8-64cf-406a-be48-7fc984e0fb62 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:35:42 np0005544118 nova_compute[187283]: 2025-12-03 14:35:42.993 187287 DEBUG oslo_concurrency.lockutils [req-a9dc8bd2-6db5-4e6a-9251-030edc8d23b9 req-c0e9c8a8-64cf-406a-be48-7fc984e0fb62 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:35:42 np0005544118 nova_compute[187283]: 2025-12-03 14:35:42.993 187287 DEBUG oslo_concurrency.lockutils [req-a9dc8bd2-6db5-4e6a-9251-030edc8d23b9 req-c0e9c8a8-64cf-406a-be48-7fc984e0fb62 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:35:42 np0005544118 nova_compute[187283]: 2025-12-03 14:35:42.993 187287 DEBUG nova.compute.manager [req-a9dc8bd2-6db5-4e6a-9251-030edc8d23b9 req-c0e9c8a8-64cf-406a-be48-7fc984e0fb62 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] No waiting events found dispatching network-vif-plugged-c30c508a-97f3-4d74-96e9-967b4d03ed1a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  3 09:35:42 np0005544118 nova_compute[187283]: 2025-12-03 14:35:42.993 187287 WARNING nova.compute.manager [req-a9dc8bd2-6db5-4e6a-9251-030edc8d23b9 req-c0e9c8a8-64cf-406a-be48-7fc984e0fb62 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Received unexpected event network-vif-plugged-c30c508a-97f3-4d74-96e9-967b4d03ed1a for instance with vm_state deleted and task_state None.
Dec  3 09:35:43 np0005544118 nova_compute[187283]: 2025-12-03 14:35:43.104 187287 DEBUG nova.compute.manager [req-b059bb13-c0e7-4c20-b808-3a5a2ab2cf59 req-9890b393-4ed0-43b1-b5c4-d9959dd75fa1 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Received event network-vif-deleted-c30c508a-97f3-4d74-96e9-967b4d03ed1a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  3 09:35:44 np0005544118 nova_compute[187283]: 2025-12-03 14:35:44.823 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:35:45 np0005544118 nova_compute[187283]: 2025-12-03 14:35:45.628 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:35:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:35:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:35:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:35:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:35:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:35:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:35:49 np0005544118 podman[215128]: 2025-12-03 14:35:49.824508552 +0000 UTC m=+0.051307751 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:35:49 np0005544118 nova_compute[187283]: 2025-12-03 14:35:49.825 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:35:49 np0005544118 nova_compute[187283]: 2025-12-03 14:35:49.880 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772534.8795962, 801daf39-165b-40fd-85ba-84d4007360c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  3 09:35:49 np0005544118 nova_compute[187283]: 2025-12-03 14:35:49.881 187287 INFO nova.compute.manager [-] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] VM Stopped (Lifecycle Event)
Dec  3 09:35:49 np0005544118 nova_compute[187283]: 2025-12-03 14:35:49.902 187287 DEBUG nova.compute.manager [None req-92343633-113d-478f-bc98-ba82c66e5eb0 - - - - - -] [instance: 801daf39-165b-40fd-85ba-84d4007360c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  3 09:35:50 np0005544118 nova_compute[187283]: 2025-12-03 14:35:50.663 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:35:53 np0005544118 podman[215147]: 2025-12-03 14:35:53.838739206 +0000 UTC m=+0.074946910 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:35:54 np0005544118 nova_compute[187283]: 2025-12-03 14:35:54.826 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:35:55 np0005544118 nova_compute[187283]: 2025-12-03 14:35:55.574 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772540.5734923, 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec  3 09:35:55 np0005544118 nova_compute[187283]: 2025-12-03 14:35:55.575 187287 INFO nova.compute.manager [-] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] VM Stopped (Lifecycle Event)
Dec  3 09:35:55 np0005544118 nova_compute[187283]: 2025-12-03 14:35:55.598 187287 DEBUG nova.compute.manager [None req-35a5c809-2e41-4f61-ae0e-c9d232a618b6 - - - - - -] [instance: 0a10b6c1-1967-4870-8ad5-6fcf12a3d4a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  3 09:35:55 np0005544118 nova_compute[187283]: 2025-12-03 14:35:55.665 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:35:57 np0005544118 podman[215171]: 2025-12-03 14:35:57.887626109 +0000 UTC m=+0.113107477 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  3 09:35:59 np0005544118 nova_compute[187283]: 2025-12-03 14:35:59.828 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:00 np0005544118 nova_compute[187283]: 2025-12-03 14:36:00.667 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:00.968 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:36:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:00.968 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:36:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:00.968 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:36:04 np0005544118 nova_compute[187283]: 2025-12-03 14:36:04.830 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:05 np0005544118 podman[197639]: time="2025-12-03T14:36:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:36:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:36:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:36:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:36:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2595 "" "Go-http-client/1.1"
Dec  3 09:36:05 np0005544118 nova_compute[187283]: 2025-12-03 14:36:05.669 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:07 np0005544118 podman[215198]: 2025-12-03 14:36:07.829400063 +0000 UTC m=+0.054141752 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, release=1755695350, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible)
Dec  3 09:36:09 np0005544118 nova_compute[187283]: 2025-12-03 14:36:09.832 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:10 np0005544118 nova_compute[187283]: 2025-12-03 14:36:10.671 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:10 np0005544118 podman[215220]: 2025-12-03 14:36:10.841936306 +0000 UTC m=+0.068603980 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  3 09:36:11 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:11Z|00164|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec  3 09:36:14 np0005544118 nova_compute[187283]: 2025-12-03 14:36:14.833 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:15 np0005544118 nova_compute[187283]: 2025-12-03 14:36:15.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:15 np0005544118 nova_compute[187283]: 2025-12-03 14:36:15.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  3 09:36:15 np0005544118 nova_compute[187283]: 2025-12-03 14:36:15.673 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:16 np0005544118 nova_compute[187283]: 2025-12-03 14:36:16.628 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:18 np0005544118 nova_compute[187283]: 2025-12-03 14:36:18.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:36:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:36:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:36:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:36:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:36:19 np0005544118 nova_compute[187283]: 2025-12-03 14:36:19.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:19 np0005544118 nova_compute[187283]: 2025-12-03 14:36:19.834 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:20 np0005544118 nova_compute[187283]: 2025-12-03 14:36:20.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:20 np0005544118 nova_compute[187283]: 2025-12-03 14:36:20.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  3 09:36:20 np0005544118 nova_compute[187283]: 2025-12-03 14:36:20.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  3 09:36:20 np0005544118 nova_compute[187283]: 2025-12-03 14:36:20.628 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  3 09:36:20 np0005544118 nova_compute[187283]: 2025-12-03 14:36:20.630 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:20 np0005544118 nova_compute[187283]: 2025-12-03 14:36:20.675 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:20 np0005544118 podman[215241]: 2025-12-03 14:36:20.826442067 +0000 UTC m=+0.053246695 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  3 09:36:21 np0005544118 nova_compute[187283]: 2025-12-03 14:36:21.620 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:23 np0005544118 nova_compute[187283]: 2025-12-03 14:36:23.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.629 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.630 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.630 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.630 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec  3 09:36:24 np0005544118 podman[215262]: 2025-12-03 14:36:24.733194679 +0000 UTC m=+0.060163355 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.818 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.820 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5870MB free_disk=73.3363151550293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.820 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.820 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.835 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.881 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.881 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.902 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.915 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.935 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.936 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.936 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.937 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:36:24 np0005544118 nova_compute[187283]: 2025-12-03 14:36:24.951 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:36:25 np0005544118 nova_compute[187283]: 2025-12-03 14:36:25.677 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:25 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:25.931 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:36:25 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:25.931 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:36:25 np0005544118 nova_compute[187283]: 2025-12-03 14:36:25.932 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:25 np0005544118 nova_compute[187283]: 2025-12-03 14:36:25.946 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:36:28 np0005544118 podman[215284]: 2025-12-03 14:36:28.864794052 +0000 UTC m=+0.090989767 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec  3 09:36:29 np0005544118 nova_compute[187283]: 2025-12-03 14:36:29.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:36:29 np0005544118 nova_compute[187283]: 2025-12-03 14:36:29.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:36:29 np0005544118 nova_compute[187283]: 2025-12-03 14:36:29.880 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:30 np0005544118 nova_compute[187283]: 2025-12-03 14:36:30.679 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:32.934 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.116 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.116 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.134 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.212 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.213 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.220 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.220 187287 INFO nova.compute.claims [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.376 187287 DEBUG nova.compute.provider_tree [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.401 187287 DEBUG nova.scheduler.client.report [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.434 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.435 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.499 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.500 187287 DEBUG nova.network.neutron [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.523 187287 INFO nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.548 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.659 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.660 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.661 187287 INFO nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Creating image(s)#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.661 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "/var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.662 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.663 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.675 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.693 187287 DEBUG nova.policy [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '050e065161ed4e4dbf457584926aae78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90982dffa8cd42a2b3280941bc8b991e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.729 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.730 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.731 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.748 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.801 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.802 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.836 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.837 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.838 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.894 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.896 187287 DEBUG nova.virt.disk.api [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Checking if we can resize image /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.896 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.925 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.952 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.953 187287 DEBUG nova.virt.disk.api [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Cannot resize image /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.953 187287 DEBUG nova.objects.instance [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'migration_context' on Instance uuid ae37f28e-8d34-4805-a71f-6dd09be662f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.971 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.972 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Ensure instance console log exists: /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.972 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.972 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:34 np0005544118 nova_compute[187283]: 2025-12-03 14:36:34.973 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:36:35 np0005544118 nova_compute[187283]: 2025-12-03 14:36:35.208 187287 DEBUG nova.network.neutron [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Successfully created port: b7fc3b79-84c5-4634-8b34-305ee3df6a85 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:36:35 np0005544118 podman[197639]: time="2025-12-03T14:36:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:36:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:36:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:36:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:36:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec  3 09:36:35 np0005544118 nova_compute[187283]: 2025-12-03 14:36:35.681 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.776 187287 DEBUG nova.network.neutron [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Successfully updated port: b7fc3b79-84c5-4634-8b34-305ee3df6a85 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.794 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.794 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquired lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.794 187287 DEBUG nova.network.neutron [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.848 187287 DEBUG nova.compute.manager [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-changed-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.849 187287 DEBUG nova.compute.manager [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Refreshing instance network info cache due to event network-changed-b7fc3b79-84c5-4634-8b34-305ee3df6a85. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.849 187287 DEBUG oslo_concurrency.lockutils [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:36:37 np0005544118 nova_compute[187283]: 2025-12-03 14:36:37.916 187287 DEBUG nova.network.neutron [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:36:38 np0005544118 podman[215325]: 2025-12-03 14:36:38.830479729 +0000 UTC m=+0.066181404 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, config_id=edpm, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7)
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.668 187287 DEBUG nova.network.neutron [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updating instance_info_cache with network_info: [{"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.689 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Releasing lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.689 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Instance network_info: |[{"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.690 187287 DEBUG oslo_concurrency.lockutils [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.690 187287 DEBUG nova.network.neutron [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Refreshing network info cache for port b7fc3b79-84c5-4634-8b34-305ee3df6a85 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.692 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Start _get_guest_xml network_info=[{"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.697 187287 WARNING nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.702 187287 DEBUG nova.virt.libvirt.host [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.702 187287 DEBUG nova.virt.libvirt.host [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.707 187287 DEBUG nova.virt.libvirt.host [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.708 187287 DEBUG nova.virt.libvirt.host [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.709 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.709 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.710 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.710 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.710 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.710 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.711 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.711 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.711 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.711 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.711 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.711 187287 DEBUG nova.virt.hardware [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.716 187287 DEBUG nova.virt.libvirt.vif [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:36:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1692745364',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1692745364',id=18,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-9iv1ypti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:36:34Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=ae37f28e-8d34-4805-a71f-6dd09be662f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.716 187287 DEBUG nova.network.os_vif_util [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.717 187287 DEBUG nova.network.os_vif_util [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.718 187287 DEBUG nova.objects.instance [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'pci_devices' on Instance uuid ae37f28e-8d34-4805-a71f-6dd09be662f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.732 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <uuid>ae37f28e-8d34-4805-a71f-6dd09be662f5</uuid>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <name>instance-00000012</name>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteStrategies-server-1692745364</nova:name>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:36:39</nova:creationTime>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:user uuid="050e065161ed4e4dbf457584926aae78">tempest-TestExecuteStrategies-270472559-project-member</nova:user>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:project uuid="90982dffa8cd42a2b3280941bc8b991e">tempest-TestExecuteStrategies-270472559</nova:project>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        <nova:port uuid="b7fc3b79-84c5-4634-8b34-305ee3df6a85">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <entry name="serial">ae37f28e-8d34-4805-a71f-6dd09be662f5</entry>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <entry name="uuid">ae37f28e-8d34-4805-a71f-6dd09be662f5</entry>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk.config"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:0c:24:05"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <target dev="tapb7fc3b79-84"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/console.log" append="off"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:36:39 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:36:39 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:36:39 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:36:39 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.733 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Preparing to wait for external event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.733 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.733 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.734 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.734 187287 DEBUG nova.virt.libvirt.vif [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:36:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1692745364',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1692745364',id=18,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-9iv1ypti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:36:34Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=ae37f28e-8d34-4805-a71f-6dd09be662f5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.735 187287 DEBUG nova.network.os_vif_util [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.735 187287 DEBUG nova.network.os_vif_util [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.736 187287 DEBUG os_vif [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.736 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.737 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.737 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.741 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.741 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb7fc3b79-84, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.742 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb7fc3b79-84, col_values=(('external_ids', {'iface-id': 'b7fc3b79-84c5-4634-8b34-305ee3df6a85', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0c:24:05', 'vm-uuid': 'ae37f28e-8d34-4805-a71f-6dd09be662f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.743 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:39 np0005544118 NetworkManager[55710]: <info>  [1764772599.7445] manager: (tapb7fc3b79-84): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.746 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.749 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.750 187287 INFO os_vif [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84')#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.833 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.834 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.834 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No VIF found with MAC fa:16:3e:0c:24:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.835 187287 INFO nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Using config drive#033[00m
Dec  3 09:36:39 np0005544118 nova_compute[187283]: 2025-12-03 14:36:39.926 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:41 np0005544118 nova_compute[187283]: 2025-12-03 14:36:41.697 187287 INFO nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Creating config drive at /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk.config#033[00m
Dec  3 09:36:41 np0005544118 nova_compute[187283]: 2025-12-03 14:36:41.703 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3c9tr6jo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:36:41 np0005544118 podman[215352]: 2025-12-03 14:36:41.821337887 +0000 UTC m=+0.054675717 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  3 09:36:41 np0005544118 nova_compute[187283]: 2025-12-03 14:36:41.827 187287 DEBUG oslo_concurrency.processutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3c9tr6jo" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:36:41 np0005544118 kernel: tapb7fc3b79-84: entered promiscuous mode
Dec  3 09:36:41 np0005544118 NetworkManager[55710]: <info>  [1764772601.8801] manager: (tapb7fc3b79-84): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Dec  3 09:36:41 np0005544118 nova_compute[187283]: 2025-12-03 14:36:41.880 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:41Z|00165|binding|INFO|Claiming lport b7fc3b79-84c5-4634-8b34-305ee3df6a85 for this chassis.
Dec  3 09:36:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:41Z|00166|binding|INFO|b7fc3b79-84c5-4634-8b34-305ee3df6a85: Claiming fa:16:3e:0c:24:05 10.100.0.5
Dec  3 09:36:41 np0005544118 nova_compute[187283]: 2025-12-03 14:36:41.883 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:41Z|00167|binding|INFO|Setting lport b7fc3b79-84c5-4634-8b34-305ee3df6a85 ovn-installed in OVS
Dec  3 09:36:41 np0005544118 nova_compute[187283]: 2025-12-03 14:36:41.894 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:41 np0005544118 systemd-udevd[215386]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:36:41 np0005544118 systemd-machined[153602]: New machine qemu-15-instance-00000012.
Dec  3 09:36:41 np0005544118 NetworkManager[55710]: <info>  [1764772601.9178] device (tapb7fc3b79-84): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:36:41 np0005544118 NetworkManager[55710]: <info>  [1764772601.9188] device (tapb7fc3b79-84): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:36:41 np0005544118 systemd[1]: Started Virtual Machine qemu-15-instance-00000012.
Dec  3 09:36:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:41Z|00168|binding|INFO|Setting lport b7fc3b79-84c5-4634-8b34-305ee3df6a85 up in Southbound
Dec  3 09:36:41 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:41.999 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:24:05 10.100.0.5'], port_security=['fa:16:3e:0c:24:05 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ae37f28e-8d34-4805-a71f-6dd09be662f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=b7fc3b79-84c5-4634-8b34-305ee3df6a85) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.000 104491 INFO neutron.agent.ovn.metadata.agent [-] Port b7fc3b79-84c5-4634-8b34-305ee3df6a85 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.001 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.013 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8371a789-2358-4df1-8e65-2f82f773b86d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.014 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267c5b5d-11 in ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.017 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267c5b5d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.017 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[12b815a8-dbe9-4f05-a2a7-0c42415ce055]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.018 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[089bfe77-69cb-4881-a921-3cb73e861675]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.029 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[2a3e67e6-91ef-4af7-8098-433fa99f1b73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.053 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2e49490e-d3ca-412a-b260-1b7142e58d18]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.079 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[70784683-bbf4-4649-8b58-2183cb7ed617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.084 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8742fb8c-fc2d-4994-a9fa-bb9f374377eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 systemd-udevd[215389]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:36:42 np0005544118 NetworkManager[55710]: <info>  [1764772602.0858] manager: (tap267c5b5d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.117 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[dcb7e706-e394-499d-b2b2-bb64bdf54b52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.121 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[14a70193-f58d-402c-9f5a-10f3ec5e479a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 NetworkManager[55710]: <info>  [1764772602.1473] device (tap267c5b5d-10): carrier: link connected
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.151 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[600fd54c-abc0-4991-86ba-df6330f58ef2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.168 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3f40e264-7088-40b1-a24d-57ab701c6cc5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475775, 'reachable_time': 38470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215427, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.178 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772602.178269, ae37f28e-8d34-4805-a71f-6dd09be662f5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.179 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] VM Started (Lifecycle Event)#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.184 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[01ead414-5595-4fd0-bea5-56f9cce74165]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:71b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475775, 'tstamp': 475775}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215428, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.199 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6c443e81-2e2e-483d-a8ef-560534e9b380]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475775, 'reachable_time': 38470, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215429, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.228 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[74c2d63a-26ea-41a9-82f7-0c64ecb39778]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.231 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.235 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772602.1783872, ae37f28e-8d34-4805-a71f-6dd09be662f5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.235 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.253 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.257 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.275 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.282 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6dbcd3-b628-48bd-a84f-f814ef6a69b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.284 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.284 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.284 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.286 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:42 np0005544118 kernel: tap267c5b5d-10: entered promiscuous mode
Dec  3 09:36:42 np0005544118 NetworkManager[55710]: <info>  [1764772602.2882] manager: (tap267c5b5d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.289 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.289 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:36:42 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:42Z|00169|binding|INFO|Releasing lport f97a1eb5-3e39-4381-b113-959844eaa3b1 from this chassis (sb_readonly=0)
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.290 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.292 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.293 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[24650e46-faf4-4089-ac6e-69865fee70e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.293 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:36:42 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:36:42.294 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'env', 'PROCESS_TAG=haproxy-267c5b5d-1150-48df-8bea-7890da55de3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267c5b5d-1150-48df-8bea-7890da55de3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:36:42 np0005544118 nova_compute[187283]: 2025-12-03 14:36:42.301 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:36:42 np0005544118 podman[215461]: 2025-12-03 14:36:42.623162032 +0000 UTC m=+0.021602456 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:36:42 np0005544118 podman[215461]: 2025-12-03 14:36:42.756740241 +0000 UTC m=+0.155180645 container create 93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec  3 09:36:42 np0005544118 systemd[1]: Started libpod-conmon-93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed.scope.
Dec  3 09:36:42 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:36:42 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d3f877800845bc8bcba189b81575090365164ee1d193d544c94be8b71888b6c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:36:42 np0005544118 podman[215461]: 2025-12-03 14:36:42.878706881 +0000 UTC m=+0.277147315 container init 93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec  3 09:36:42 np0005544118 podman[215461]: 2025-12-03 14:36:42.884092169 +0000 UTC m=+0.282532573 container start 93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Dec  3 09:36:42 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [NOTICE]   (215480) : New worker (215482) forked
Dec  3 09:36:42 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [NOTICE]   (215480) : Loading success.
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.484 187287 DEBUG nova.compute.manager [req-75e854c6-7738-4f01-a781-4ffe64eb5050 req-5c4bb92e-fd0c-45eb-84b9-f17d61e653ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.484 187287 DEBUG oslo_concurrency.lockutils [req-75e854c6-7738-4f01-a781-4ffe64eb5050 req-5c4bb92e-fd0c-45eb-84b9-f17d61e653ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.485 187287 DEBUG oslo_concurrency.lockutils [req-75e854c6-7738-4f01-a781-4ffe64eb5050 req-5c4bb92e-fd0c-45eb-84b9-f17d61e653ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.485 187287 DEBUG oslo_concurrency.lockutils [req-75e854c6-7738-4f01-a781-4ffe64eb5050 req-5c4bb92e-fd0c-45eb-84b9-f17d61e653ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.485 187287 DEBUG nova.compute.manager [req-75e854c6-7738-4f01-a781-4ffe64eb5050 req-5c4bb92e-fd0c-45eb-84b9-f17d61e653ac c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Processing event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.486 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.490 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772603.4898505, ae37f28e-8d34-4805-a71f-6dd09be662f5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.490 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.492 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.495 187287 INFO nova.virt.libvirt.driver [-] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Instance spawned successfully.#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.495 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.557 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.562 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.565 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.565 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.566 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.566 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.567 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.567 187287 DEBUG nova.virt.libvirt.driver [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.761 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] During sync_power_state the instance has a pending task (spawning). Skip.
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.801 187287 INFO nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Took 9.14 seconds to spawn the instance on the hypervisor.
Dec  3 09:36:43 np0005544118 nova_compute[187283]: 2025-12-03 14:36:43.802 187287 DEBUG nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec  3 09:36:44 np0005544118 nova_compute[187283]: 2025-12-03 14:36:44.108 187287 INFO nova.compute.manager [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Took 9.92 seconds to build instance.
Dec  3 09:36:44 np0005544118 nova_compute[187283]: 2025-12-03 14:36:44.206 187287 DEBUG nova.network.neutron [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updated VIF entry in instance network info cache for port b7fc3b79-84c5-4634-8b34-305ee3df6a85. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec  3 09:36:44 np0005544118 nova_compute[187283]: 2025-12-03 14:36:44.207 187287 DEBUG nova.network.neutron [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updating instance_info_cache with network_info: [{"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec  3 09:36:44 np0005544118 nova_compute[187283]: 2025-12-03 14:36:44.264 187287 DEBUG oslo_concurrency.lockutils [None req-396ca414-645e-4800-ab50-308e844bd988 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:36:44 np0005544118 nova_compute[187283]: 2025-12-03 14:36:44.342 187287 DEBUG oslo_concurrency.lockutils [req-56d36f62-1ba2-40fc-8faa-3e3766d5643b req-b84a8136-ddcb-4639-a3c3-5d6299a3516b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec  3 09:36:44 np0005544118 nova_compute[187283]: 2025-12-03 14:36:44.744 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:44 np0005544118 nova_compute[187283]: 2025-12-03 14:36:44.929 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:45 np0005544118 nova_compute[187283]: 2025-12-03 14:36:45.585 187287 DEBUG nova.compute.manager [req-c0f743e2-ac20-4e8b-9c1b-484e56906e0d req-6a02df59-8d2f-4a1e-9238-7bf3481dc12a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  3 09:36:45 np0005544118 nova_compute[187283]: 2025-12-03 14:36:45.586 187287 DEBUG oslo_concurrency.lockutils [req-c0f743e2-ac20-4e8b-9c1b-484e56906e0d req-6a02df59-8d2f-4a1e-9238-7bf3481dc12a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:36:45 np0005544118 nova_compute[187283]: 2025-12-03 14:36:45.586 187287 DEBUG oslo_concurrency.lockutils [req-c0f743e2-ac20-4e8b-9c1b-484e56906e0d req-6a02df59-8d2f-4a1e-9238-7bf3481dc12a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:36:45 np0005544118 nova_compute[187283]: 2025-12-03 14:36:45.586 187287 DEBUG oslo_concurrency.lockutils [req-c0f743e2-ac20-4e8b-9c1b-484e56906e0d req-6a02df59-8d2f-4a1e-9238-7bf3481dc12a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:36:45 np0005544118 nova_compute[187283]: 2025-12-03 14:36:45.586 187287 DEBUG nova.compute.manager [req-c0f743e2-ac20-4e8b-9c1b-484e56906e0d req-6a02df59-8d2f-4a1e-9238-7bf3481dc12a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  3 09:36:45 np0005544118 nova_compute[187283]: 2025-12-03 14:36:45.587 187287 WARNING nova.compute.manager [req-c0f743e2-ac20-4e8b-9c1b-484e56906e0d req-6a02df59-8d2f-4a1e-9238-7bf3481dc12a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received unexpected event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with vm_state active and task_state None.
Dec  3 09:36:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:36:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:36:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:36:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:36:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:36:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:36:49 np0005544118 nova_compute[187283]: 2025-12-03 14:36:49.784 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:49 np0005544118 nova_compute[187283]: 2025-12-03 14:36:49.931 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:51 np0005544118 podman[215491]: 2025-12-03 14:36:51.851188473 +0000 UTC m=+0.083210864 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:36:54 np0005544118 nova_compute[187283]: 2025-12-03 14:36:54.788 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:54 np0005544118 podman[215510]: 2025-12-03 14:36:54.834817372 +0000 UTC m=+0.053334520 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:36:54 np0005544118 nova_compute[187283]: 2025-12-03 14:36:54.932 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:57 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:57Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0c:24:05 10.100.0.5
Dec  3 09:36:57 np0005544118 ovn_controller[95637]: 2025-12-03T14:36:57Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0c:24:05 10.100.0.5
Dec  3 09:36:59 np0005544118 nova_compute[187283]: 2025-12-03 14:36:59.836 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:36:59 np0005544118 podman[215553]: 2025-12-03 14:36:59.892478396 +0000 UTC m=+0.128616654 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:36:59 np0005544118 nova_compute[187283]: 2025-12-03 14:36:59.933 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:37:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:00.969 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:37:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:00.970 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:37:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:00.971 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:37:04 np0005544118 nova_compute[187283]: 2025-12-03 14:37:04.839 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:37:04 np0005544118 nova_compute[187283]: 2025-12-03 14:37:04.936 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:37:05 np0005544118 podman[197639]: time="2025-12-03T14:37:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:37:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:37:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:37:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:37:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3064 "" "Go-http-client/1.1"
Dec  3 09:37:09 np0005544118 podman[215579]: 2025-12-03 14:37:09.821442332 +0000 UTC m=+0.050196983 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git)
Dec  3 09:37:09 np0005544118 nova_compute[187283]: 2025-12-03 14:37:09.841 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:37:09 np0005544118 nova_compute[187283]: 2025-12-03 14:37:09.938 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:37:11 np0005544118 ovn_controller[95637]: 2025-12-03T14:37:11Z|00170|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec  3 09:37:12 np0005544118 podman[215600]: 2025-12-03 14:37:12.81845556 +0000 UTC m=+0.055587983 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  3 09:37:14 np0005544118 nova_compute[187283]: 2025-12-03 14:37:14.843 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:37:14 np0005544118 nova_compute[187283]: 2025-12-03 14:37:14.939 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:37:15 np0005544118 nova_compute[187283]: 2025-12-03 14:37:15.232 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Check if temp file /var/lib/nova/instances/tmpiz2v8okp exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Dec  3 09:37:15 np0005544118 nova_compute[187283]: 2025-12-03 14:37:15.233 187287 DEBUG nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpiz2v8okp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ae37f28e-8d34-4805-a71f-6dd09be662f5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Dec  3 09:37:15 np0005544118 nova_compute[187283]: 2025-12-03 14:37:15.970 187287 DEBUG oslo_concurrency.processutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  3 09:37:16 np0005544118 nova_compute[187283]: 2025-12-03 14:37:16.031 187287 DEBUG oslo_concurrency.processutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  3 09:37:16 np0005544118 nova_compute[187283]: 2025-12-03 14:37:16.032 187287 DEBUG oslo_concurrency.processutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  3 09:37:16 np0005544118 nova_compute[187283]: 2025-12-03 14:37:16.085 187287 DEBUG oslo_concurrency.processutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  3 09:37:17 np0005544118 systemd[1]: Created slice User Slice of UID 42436.
Dec  3 09:37:17 np0005544118 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec  3 09:37:17 np0005544118 systemd-logind[795]: New session 30 of user nova.
Dec  3 09:37:17 np0005544118 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec  3 09:37:17 np0005544118 systemd[1]: Starting User Manager for UID 42436...
Dec  3 09:37:17 np0005544118 systemd[215632]: Queued start job for default target Main User Target.
Dec  3 09:37:17 np0005544118 systemd[215632]: Created slice User Application Slice.
Dec  3 09:37:17 np0005544118 systemd[215632]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:37:17 np0005544118 systemd[215632]: Started Daily Cleanup of User's Temporary Directories.
Dec  3 09:37:17 np0005544118 systemd[215632]: Reached target Paths.
Dec  3 09:37:17 np0005544118 systemd[215632]: Reached target Timers.
Dec  3 09:37:17 np0005544118 systemd[215632]: Starting D-Bus User Message Bus Socket...
Dec  3 09:37:17 np0005544118 systemd[215632]: Starting Create User's Volatile Files and Directories...
Dec  3 09:37:17 np0005544118 systemd[215632]: Listening on D-Bus User Message Bus Socket.
Dec  3 09:37:17 np0005544118 systemd[215632]: Reached target Sockets.
Dec  3 09:37:17 np0005544118 systemd[215632]: Finished Create User's Volatile Files and Directories.
Dec  3 09:37:17 np0005544118 systemd[215632]: Reached target Basic System.
Dec  3 09:37:17 np0005544118 systemd[215632]: Reached target Main User Target.
Dec  3 09:37:17 np0005544118 systemd[215632]: Startup finished in 136ms.
Dec  3 09:37:17 np0005544118 systemd[1]: Started User Manager for UID 42436.
Dec  3 09:37:17 np0005544118 systemd[1]: Started Session 30 of User nova.
Dec  3 09:37:18 np0005544118 systemd[1]: session-30.scope: Deactivated successfully.
Dec  3 09:37:18 np0005544118 systemd-logind[795]: Session 30 logged out. Waiting for processes to exit.
Dec  3 09:37:18 np0005544118 systemd-logind[795]: Removed session 30.
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.569 187287 DEBUG nova.compute.manager [req-c316a693-4529-4375-9d03-5eecd01455b0 req-4ea16ac2-1549-44d4-b13d-83c4ec754a31 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.570 187287 DEBUG oslo_concurrency.lockutils [req-c316a693-4529-4375-9d03-5eecd01455b0 req-4ea16ac2-1549-44d4-b13d-83c4ec754a31 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.570 187287 DEBUG oslo_concurrency.lockutils [req-c316a693-4529-4375-9d03-5eecd01455b0 req-4ea16ac2-1549-44d4-b13d-83c4ec754a31 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.570 187287 DEBUG oslo_concurrency.lockutils [req-c316a693-4529-4375-9d03-5eecd01455b0 req-4ea16ac2-1549-44d4-b13d-83c4ec754a31 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.570 187287 DEBUG nova.compute.manager [req-c316a693-4529-4375-9d03-5eecd01455b0 req-4ea16ac2-1549-44d4-b13d-83c4ec754a31 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.571 187287 DEBUG nova.compute.manager [req-c316a693-4529-4375-9d03-5eecd01455b0 req-4ea16ac2-1549-44d4-b13d-83c4ec754a31 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.977 187287 INFO nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Took 2.89 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.977 187287 DEBUG nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:37:18 np0005544118 nova_compute[187283]: 2025-12-03 14:37:18.996 187287 DEBUG nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpiz2v8okp',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ae37f28e-8d34-4805-a71f-6dd09be662f5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(589749ad-91c9-44ff-9a44-6c1928321630),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.018 187287 DEBUG nova.objects.instance [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid ae37f28e-8d34-4805-a71f-6dd09be662f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.019 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.021 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.021 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.039 187287 DEBUG nova.virt.libvirt.vif [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:36:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1692745364',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1692745364',id=18,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:36:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-9iv1ypti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:36:43Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=ae37f28e-8d34-4805-a71f-6dd09be662f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.040 187287 DEBUG nova.network.os_vif_util [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.041 187287 DEBUG nova.network.os_vif_util [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.041 187287 DEBUG nova.virt.libvirt.migration [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updating guest XML with vif config: <interface type="ethernet">
Dec  3 09:37:19 np0005544118 nova_compute[187283]:  <mac address="fa:16:3e:0c:24:05"/>
Dec  3 09:37:19 np0005544118 nova_compute[187283]:  <model type="virtio"/>
Dec  3 09:37:19 np0005544118 nova_compute[187283]:  <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:37:19 np0005544118 nova_compute[187283]:  <mtu size="1442"/>
Dec  3 09:37:19 np0005544118 nova_compute[187283]:  <target dev="tapb7fc3b79-84"/>
Dec  3 09:37:19 np0005544118 nova_compute[187283]: </interface>
Dec  3 09:37:19 np0005544118 nova_compute[187283]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.042 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Dec  3 09:37:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:37:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:37:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:37:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:37:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:37:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:37:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.524 187287 DEBUG nova.virt.libvirt.migration [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.524 187287 INFO nova.virt.libvirt.migration [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.580 187287 INFO nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.845 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:19 np0005544118 nova_compute[187283]: 2025-12-03 14:37:19.940 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.084 187287 DEBUG nova.virt.libvirt.migration [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.085 187287 DEBUG nova.virt.libvirt.migration [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.555 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772640.5542257, ae37f28e-8d34-4805-a71f-6dd09be662f5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.556 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.574 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.579 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.588 187287 DEBUG nova.virt.libvirt.migration [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.589 187287 DEBUG nova.virt.libvirt.migration [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.603 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.657 187287 DEBUG nova.compute.manager [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.658 187287 DEBUG oslo_concurrency.lockutils [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.658 187287 DEBUG oslo_concurrency.lockutils [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.658 187287 DEBUG oslo_concurrency.lockutils [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.658 187287 DEBUG nova.compute.manager [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.659 187287 WARNING nova.compute.manager [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received unexpected event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.659 187287 DEBUG nova.compute.manager [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-changed-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.659 187287 DEBUG nova.compute.manager [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Refreshing instance network info cache due to event network-changed-b7fc3b79-84c5-4634-8b34-305ee3df6a85. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.659 187287 DEBUG oslo_concurrency.lockutils [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.660 187287 DEBUG oslo_concurrency.lockutils [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.660 187287 DEBUG nova.network.neutron [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Refreshing network info cache for port b7fc3b79-84c5-4634-8b34-305ee3df6a85 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:37:20 np0005544118 kernel: tapb7fc3b79-84 (unregistering): left promiscuous mode
Dec  3 09:37:20 np0005544118 NetworkManager[55710]: <info>  [1764772640.7148] device (tapb7fc3b79-84): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.726 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:37:20Z|00171|binding|INFO|Releasing lport b7fc3b79-84c5-4634-8b34-305ee3df6a85 from this chassis (sb_readonly=0)
Dec  3 09:37:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:37:20Z|00172|binding|INFO|Setting lport b7fc3b79-84c5-4634-8b34-305ee3df6a85 down in Southbound
Dec  3 09:37:20 np0005544118 ovn_controller[95637]: 2025-12-03T14:37:20Z|00173|binding|INFO|Removing iface tapb7fc3b79-84 ovn-installed in OVS
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.728 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:20.733 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0c:24:05 10.100.0.5'], port_security=['fa:16:3e:0c:24:05 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3a9d7e7b-04f9-4aed-a199-9003ff5fe58c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'ae37f28e-8d34-4805-a71f-6dd09be662f5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=b7fc3b79-84c5-4634-8b34-305ee3df6a85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:37:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:20.735 104491 INFO neutron.agent.ovn.metadata.agent [-] Port b7fc3b79-84c5-4634-8b34-305ee3df6a85 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:37:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:20.737 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267c5b5d-1150-48df-8bea-7890da55de3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:37:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:20.740 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a593d6a2-3996-47d6-b610-44ed2a0c2e68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:20.741 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace which is not needed anymore#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.745 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:20 np0005544118 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Deactivated successfully.
Dec  3 09:37:20 np0005544118 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Consumed 14.692s CPU time.
Dec  3 09:37:20 np0005544118 systemd-machined[153602]: Machine qemu-15-instance-00000012 terminated.
Dec  3 09:37:20 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [NOTICE]   (215480) : haproxy version is 2.8.14-c23fe91
Dec  3 09:37:20 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [NOTICE]   (215480) : path to executable is /usr/sbin/haproxy
Dec  3 09:37:20 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [WARNING]  (215480) : Exiting Master process...
Dec  3 09:37:20 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [WARNING]  (215480) : Exiting Master process...
Dec  3 09:37:20 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [ALERT]    (215480) : Current worker (215482) exited with code 143 (Terminated)
Dec  3 09:37:20 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[215476]: [WARNING]  (215480) : All workers exited. Exiting... (0)
Dec  3 09:37:20 np0005544118 systemd[1]: libpod-93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed.scope: Deactivated successfully.
Dec  3 09:37:20 np0005544118 podman[215679]: 2025-12-03 14:37:20.900234558 +0000 UTC m=+0.056130616 container died 93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.919 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.924 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:20 np0005544118 systemd[1]: var-lib-containers-storage-overlay-8d3f877800845bc8bcba189b81575090365164ee1d193d544c94be8b71888b6c-merged.mount: Deactivated successfully.
Dec  3 09:37:20 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed-userdata-shm.mount: Deactivated successfully.
Dec  3 09:37:20 np0005544118 podman[215679]: 2025-12-03 14:37:20.962933436 +0000 UTC m=+0.118829464 container cleanup 93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.963 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.964 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.964 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Dec  3 09:37:20 np0005544118 systemd[1]: libpod-conmon-93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed.scope: Deactivated successfully.
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.998 187287 DEBUG nova.compute.manager [req-ce570d33-5578-455c-b63c-c6790ea62066 req-11045e8a-0299-466e-97bd-1bcc15821a87 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.999 187287 DEBUG oslo_concurrency.lockutils [req-ce570d33-5578-455c-b63c-c6790ea62066 req-11045e8a-0299-466e-97bd-1bcc15821a87 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.999 187287 DEBUG oslo_concurrency.lockutils [req-ce570d33-5578-455c-b63c-c6790ea62066 req-11045e8a-0299-466e-97bd-1bcc15821a87 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.999 187287 DEBUG oslo_concurrency.lockutils [req-ce570d33-5578-455c-b63c-c6790ea62066 req-11045e8a-0299-466e-97bd-1bcc15821a87 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.999 187287 DEBUG nova.compute.manager [req-ce570d33-5578-455c-b63c-c6790ea62066 req-11045e8a-0299-466e-97bd-1bcc15821a87 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:37:20 np0005544118 nova_compute[187283]: 2025-12-03 14:37:20.999 187287 DEBUG nova.compute.manager [req-ce570d33-5578-455c-b63c-c6790ea62066 req-11045e8a-0299-466e-97bd-1bcc15821a87 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:37:21 np0005544118 podman[215723]: 2025-12-03 14:37:21.02699279 +0000 UTC m=+0.044087045 container remove 93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.033 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[85460c31-6e4b-48f7-bc07-26c8f7f6aeda]: (4, ('Wed Dec  3 02:37:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed)\n93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed\nWed Dec  3 02:37:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed)\n93dfa08288a75d1d80305e6a34475132453d0dd272288cd1390c1a9ba25165ed\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.035 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3680b0c2-7409-42c2-9bfc-31cfe2e605d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.036 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:37:21 np0005544118 nova_compute[187283]: 2025-12-03 14:37:21.038 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:21 np0005544118 kernel: tap267c5b5d-10: left promiscuous mode
Dec  3 09:37:21 np0005544118 nova_compute[187283]: 2025-12-03 14:37:21.052 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:21 np0005544118 nova_compute[187283]: 2025-12-03 14:37:21.054 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.057 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fac376ea-b22e-43b3-a0f6-0b2612c5a5b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.073 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5388255d-cf90-4185-9680-814474ff7495]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.076 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[82f8a255-2b2f-4e25-bd60-a7aa05071be2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:21 np0005544118 nova_compute[187283]: 2025-12-03 14:37:21.092 187287 DEBUG nova.virt.libvirt.guest [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'ae37f28e-8d34-4805-a71f-6dd09be662f5' (instance-00000012) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Dec  3 09:37:21 np0005544118 nova_compute[187283]: 2025-12-03 14:37:21.092 187287 INFO nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Migration operation has completed#033[00m
Dec  3 09:37:21 np0005544118 nova_compute[187283]: 2025-12-03 14:37:21.092 187287 INFO nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] _post_live_migration() is started..#033[00m
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.093 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[968ba898-364b-45e3-a506-c76c108ee7a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475768, 'reachable_time': 15245, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215738, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:21 np0005544118 systemd[1]: run-netns-ovnmeta\x2d267c5b5d\x2d1150\x2d48df\x2d8bea\x2d7890da55de3f.mount: Deactivated successfully.
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.098 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:37:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:21.098 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[4479a49d-87c0-419c-befe-86cc54953a99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.091 187287 DEBUG nova.network.neutron [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Activated binding for port b7fc3b79-84c5-4634-8b34-305ee3df6a85 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.092 187287 DEBUG nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.093 187287 DEBUG nova.virt.libvirt.vif [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:36:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1692745364',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1692745364',id=18,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:36:43Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-9iv1ypti',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='
0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:37:12Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=ae37f28e-8d34-4805-a71f-6dd09be662f5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.093 187287 DEBUG nova.network.os_vif_util [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.094 187287 DEBUG nova.network.os_vif_util [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.094 187287 DEBUG os_vif [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.096 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.096 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb7fc3b79-84, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.097 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.099 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.104 187287 INFO os_vif [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0c:24:05,bridge_name='br-int',has_traffic_filtering=True,id=b7fc3b79-84c5-4634-8b34-305ee3df6a85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb7fc3b79-84')#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.105 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.105 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.105 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.105 187287 DEBUG nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.106 187287 INFO nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Deleting instance files /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5_del#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.106 187287 INFO nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Deletion of /var/lib/nova/instances/ae37f28e-8d34-4805-a71f-6dd09be662f5_del complete#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.321 187287 DEBUG nova.network.neutron [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updated VIF entry in instance network info cache for port b7fc3b79-84c5-4634-8b34-305ee3df6a85. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.321 187287 DEBUG nova.network.neutron [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updating instance_info_cache with network_info: [{"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.337 187287 DEBUG oslo_concurrency.lockutils [req-6ccbf72c-7677-44e5-b1a4-29d9006e252d req-443d3283-d407-4351-a0b0-d91278cf41a9 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.623 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.624 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.624 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.624 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid ae37f28e-8d34-4805-a71f-6dd09be662f5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.761 187287 DEBUG nova.compute.manager [req-5691e68a-fdcd-4a20-abfc-32d2c923a6dd req-883d02c4-0f29-451b-a536-aa0ba5ef91aa c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.761 187287 DEBUG oslo_concurrency.lockutils [req-5691e68a-fdcd-4a20-abfc-32d2c923a6dd req-883d02c4-0f29-451b-a536-aa0ba5ef91aa c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.761 187287 DEBUG oslo_concurrency.lockutils [req-5691e68a-fdcd-4a20-abfc-32d2c923a6dd req-883d02c4-0f29-451b-a536-aa0ba5ef91aa c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.761 187287 DEBUG oslo_concurrency.lockutils [req-5691e68a-fdcd-4a20-abfc-32d2c923a6dd req-883d02c4-0f29-451b-a536-aa0ba5ef91aa c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.762 187287 DEBUG nova.compute.manager [req-5691e68a-fdcd-4a20-abfc-32d2c923a6dd req-883d02c4-0f29-451b-a536-aa0ba5ef91aa c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:37:22 np0005544118 nova_compute[187283]: 2025-12-03 14:37:22.762 187287 DEBUG nova.compute.manager [req-5691e68a-fdcd-4a20-abfc-32d2c923a6dd req-883d02c4-0f29-451b-a536-aa0ba5ef91aa c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-unplugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:37:22 np0005544118 podman[215739]: 2025-12-03 14:37:22.819197284 +0000 UTC m=+0.051452758 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec  3 09:37:22 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.302 187287 DEBUG nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.302 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.303 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.303 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.303 187287 DEBUG nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.303 187287 WARNING nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received unexpected event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.304 187287 DEBUG nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.304 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.304 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.304 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.304 187287 DEBUG nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.305 187287 WARNING nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received unexpected event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.305 187287 DEBUG nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.305 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.305 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.306 187287 DEBUG oslo_concurrency.lockutils [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.306 187287 DEBUG nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:37:23 np0005544118 nova_compute[187283]: 2025-12-03 14:37:23.306 187287 WARNING nova.compute.manager [req-f54b3fe5-4109-4944-ad01-154740b07352 req-54c289cd-b28b-42f0-9f18-ab2999a2b559 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received unexpected event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.336 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updating instance_info_cache with network_info: [{"id": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "address": "fa:16:3e:0c:24:05", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb7fc3b79-84", "ovs_interfaceid": "b7fc3b79-84c5-4634-8b34-305ee3df6a85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.353 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-ae37f28e-8d34-4805-a71f-6dd09be662f5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.353 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.354 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.354 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.636 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.636 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.637 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.797 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.798 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5870MB free_disk=73.33630752563477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.798 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.798 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.860 187287 INFO nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Updating resource usage from migration 589749ad-91c9-44ff-9a44-6c1928321630#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.933 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Migration 589749ad-91c9-44ff-9a44-6c1928321630 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.933 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.934 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.941 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.971 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:37:24 np0005544118 nova_compute[187283]: 2025-12-03 14:37:24.986 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.006 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.006 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.396 187287 DEBUG nova.compute.manager [req-e6fccb05-caca-4dc0-93c3-e414659c5588 req-059d70e5-e7ac-49c9-9451-c2a1bf6531cf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.396 187287 DEBUG oslo_concurrency.lockutils [req-e6fccb05-caca-4dc0-93c3-e414659c5588 req-059d70e5-e7ac-49c9-9451-c2a1bf6531cf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.396 187287 DEBUG oslo_concurrency.lockutils [req-e6fccb05-caca-4dc0-93c3-e414659c5588 req-059d70e5-e7ac-49c9-9451-c2a1bf6531cf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.396 187287 DEBUG oslo_concurrency.lockutils [req-e6fccb05-caca-4dc0-93c3-e414659c5588 req-059d70e5-e7ac-49c9-9451-c2a1bf6531cf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.397 187287 DEBUG nova.compute.manager [req-e6fccb05-caca-4dc0-93c3-e414659c5588 req-059d70e5-e7ac-49c9-9451-c2a1bf6531cf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] No waiting events found dispatching network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:37:25 np0005544118 nova_compute[187283]: 2025-12-03 14:37:25.397 187287 WARNING nova.compute.manager [req-e6fccb05-caca-4dc0-93c3-e414659c5588 req-059d70e5-e7ac-49c9-9451-c2a1bf6531cf c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Received unexpected event network-vif-plugged-b7fc3b79-84c5-4634-8b34-305ee3df6a85 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:37:25 np0005544118 podman[215761]: 2025-12-03 14:37:25.821375884 +0000 UTC m=+0.052888888 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.765 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.765 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.765 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ae37f28e-8d34-4805-a71f-6dd09be662f5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.789 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.789 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.790 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.790 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.941 187287 WARNING nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.942 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5865MB free_disk=73.3363265991211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.942 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.942 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:37:26 np0005544118 nova_compute[187283]: 2025-12-03 14:37:26.987 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Migration for instance ae37f28e-8d34-4805-a71f-6dd09be662f5 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.001 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.010 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.035 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Migration 589749ad-91c9-44ff-9a44-6c1928321630 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.036 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.036 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.074 187287 DEBUG nova.compute.provider_tree [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.088 187287 DEBUG nova.scheduler.client.report [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.123 187287 DEBUG nova.compute.resource_tracker [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.123 187287 DEBUG oslo_concurrency.lockutils [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.128 187287 INFO nova.compute.manager [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.137 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.213 187287 INFO nova.scheduler.client.report [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Deleted allocation for migration 589749ad-91c9-44ff-9a44-6c1928321630#033[00m
Dec  3 09:37:27 np0005544118 nova_compute[187283]: 2025-12-03 14:37:27.214 187287 DEBUG nova.virt.libvirt.driver [None req-dabc314a-ffde-401c-9d69-15328454f0dd b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Dec  3 09:37:28 np0005544118 systemd[1]: Stopping User Manager for UID 42436...
Dec  3 09:37:28 np0005544118 systemd[215632]: Activating special unit Exit the Session...
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped target Main User Target.
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped target Basic System.
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped target Paths.
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped target Sockets.
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped target Timers.
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  3 09:37:28 np0005544118 systemd[215632]: Closed D-Bus User Message Bus Socket.
Dec  3 09:37:28 np0005544118 systemd[215632]: Stopped Create User's Volatile Files and Directories.
Dec  3 09:37:28 np0005544118 systemd[215632]: Removed slice User Application Slice.
Dec  3 09:37:28 np0005544118 systemd[215632]: Reached target Shutdown.
Dec  3 09:37:28 np0005544118 systemd[215632]: Finished Exit the Session.
Dec  3 09:37:28 np0005544118 systemd[215632]: Reached target Exit the Session.
Dec  3 09:37:28 np0005544118 systemd[1]: user@42436.service: Deactivated successfully.
Dec  3 09:37:28 np0005544118 systemd[1]: Stopped User Manager for UID 42436.
Dec  3 09:37:28 np0005544118 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec  3 09:37:28 np0005544118 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec  3 09:37:28 np0005544118 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec  3 09:37:28 np0005544118 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec  3 09:37:28 np0005544118 systemd[1]: Removed slice User Slice of UID 42436.
Dec  3 09:37:29 np0005544118 nova_compute[187283]: 2025-12-03 14:37:29.943 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:30 np0005544118 nova_compute[187283]: 2025-12-03 14:37:30.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:30 np0005544118 nova_compute[187283]: 2025-12-03 14:37:30.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:37:30 np0005544118 podman[215786]: 2025-12-03 14:37:30.84934233 +0000 UTC m=+0.081305880 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec  3 09:37:31 np0005544118 nova_compute[187283]: 2025-12-03 14:37:31.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:37:32 np0005544118 nova_compute[187283]: 2025-12-03 14:37:32.140 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:33 np0005544118 nova_compute[187283]: 2025-12-03 14:37:33.155 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:33.155 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:37:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:33.156 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:37:34 np0005544118 nova_compute[187283]: 2025-12-03 14:37:34.945 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:35 np0005544118 podman[197639]: time="2025-12-03T14:37:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:37:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:37:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:37:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:37:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Dec  3 09:37:35 np0005544118 nova_compute[187283]: 2025-12-03 14:37:35.960 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772640.9590447, ae37f28e-8d34-4805-a71f-6dd09be662f5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:37:35 np0005544118 nova_compute[187283]: 2025-12-03 14:37:35.960 187287 INFO nova.compute.manager [-] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:37:35 np0005544118 nova_compute[187283]: 2025-12-03 14:37:35.984 187287 DEBUG nova.compute.manager [None req-2e466434-ff79-46bc-af07-3398fcc69dd3 - - - - - -] [instance: ae37f28e-8d34-4805-a71f-6dd09be662f5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:37:37 np0005544118 nova_compute[187283]: 2025-12-03 14:37:37.142 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:37:39.158 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:37:39 np0005544118 nova_compute[187283]: 2025-12-03 14:37:39.997 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:40 np0005544118 podman[215812]: 2025-12-03 14:37:40.817675843 +0000 UTC m=+0.053990367 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public)
Dec  3 09:37:42 np0005544118 nova_compute[187283]: 2025-12-03 14:37:42.208 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:43 np0005544118 podman[215834]: 2025-12-03 14:37:43.825420024 +0000 UTC m=+0.057508824 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:37:45 np0005544118 nova_compute[187283]: 2025-12-03 14:37:44.999 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:47 np0005544118 nova_compute[187283]: 2025-12-03 14:37:47.210 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:37:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:37:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:37:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:37:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:37:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:37:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:37:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:37:50 np0005544118 nova_compute[187283]: 2025-12-03 14:37:50.002 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:52 np0005544118 nova_compute[187283]: 2025-12-03 14:37:52.212 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:53 np0005544118 podman[215854]: 2025-12-03 14:37:53.816190514 +0000 UTC m=+0.048837847 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:37:55 np0005544118 nova_compute[187283]: 2025-12-03 14:37:55.003 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:37:56 np0005544118 podman[215875]: 2025-12-03 14:37:56.857594124 +0000 UTC m=+0.082970696 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:37:57 np0005544118 nova_compute[187283]: 2025-12-03 14:37:57.215 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:00 np0005544118 nova_compute[187283]: 2025-12-03 14:38:00.006 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:00.969 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:00.969 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:00.969 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:01 np0005544118 podman[215899]: 2025-12-03 14:38:01.872526152 +0000 UTC m=+0.094911055 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:38:02 np0005544118 nova_compute[187283]: 2025-12-03 14:38:02.217 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:05 np0005544118 nova_compute[187283]: 2025-12-03 14:38:05.006 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:05 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:05Z|00174|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  3 09:38:05 np0005544118 podman[197639]: time="2025-12-03T14:38:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:38:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:38:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:38:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:38:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2593 "" "Go-http-client/1.1"
Dec  3 09:38:07 np0005544118 nova_compute[187283]: 2025-12-03 14:38:07.220 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:10 np0005544118 nova_compute[187283]: 2025-12-03 14:38:10.007 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:11 np0005544118 nova_compute[187283]: 2025-12-03 14:38:11.398 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:11 np0005544118 nova_compute[187283]: 2025-12-03 14:38:11.398 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:11 np0005544118 nova_compute[187283]: 2025-12-03 14:38:11.530 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:38:11 np0005544118 nova_compute[187283]: 2025-12-03 14:38:11.721 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:11 np0005544118 nova_compute[187283]: 2025-12-03 14:38:11.721 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:11 np0005544118 nova_compute[187283]: 2025-12-03 14:38:11.729 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:38:11 np0005544118 nova_compute[187283]: 2025-12-03 14:38:11.729 187287 INFO nova.compute.claims [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:38:11 np0005544118 podman[215926]: 2025-12-03 14:38:11.8164493 +0000 UTC m=+0.053447583 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git)
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.222 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.416 187287 DEBUG nova.compute.provider_tree [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.468 187287 DEBUG nova.scheduler.client.report [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.487 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.487 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.577 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.578 187287 DEBUG nova.network.neutron [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.686 187287 INFO nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.736 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:38:12 np0005544118 nova_compute[187283]: 2025-12-03 14:38:12.759 187287 DEBUG nova.policy [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '050e065161ed4e4dbf457584926aae78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90982dffa8cd42a2b3280941bc8b991e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.159 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.160 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.161 187287 INFO nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Creating image(s)#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.161 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "/var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.161 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.162 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.177 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.241 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.242 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.243 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.255 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.318 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.319 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.353 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.354 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.355 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.411 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.412 187287 DEBUG nova.virt.disk.api [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Checking if we can resize image /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.413 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.467 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.467 187287 DEBUG nova.virt.disk.api [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Cannot resize image /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.468 187287 DEBUG nova.objects.instance [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'migration_context' on Instance uuid 973264bb-ebfb-4c89-a78d-154cf5c6c4ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.483 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.483 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Ensure instance console log exists: /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.484 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.484 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.484 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.672 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:13.673 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:38:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:13.674 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:38:13 np0005544118 nova_compute[187283]: 2025-12-03 14:38:13.782 187287 DEBUG nova.network.neutron [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Successfully created port: c819577b-9486-4fec-bb14-cb1c2713cb60 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:38:14 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:14.676 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.739 187287 DEBUG nova.network.neutron [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Successfully updated port: c819577b-9486-4fec-bb14-cb1c2713cb60 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.755 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.756 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquired lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.756 187287 DEBUG nova.network.neutron [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:38:14 np0005544118 podman[215962]: 2025-12-03 14:38:14.819645718 +0000 UTC m=+0.054911713 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.839 187287 DEBUG nova.compute.manager [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received event network-changed-c819577b-9486-4fec-bb14-cb1c2713cb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.840 187287 DEBUG nova.compute.manager [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Refreshing instance network info cache due to event network-changed-c819577b-9486-4fec-bb14-cb1c2713cb60. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.840 187287 DEBUG oslo_concurrency.lockutils [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:38:14 np0005544118 nova_compute[187283]: 2025-12-03 14:38:14.897 187287 DEBUG nova.network.neutron [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.009 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.757 187287 DEBUG nova.network.neutron [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updating instance_info_cache with network_info: [{"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.780 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Releasing lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.781 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Instance network_info: |[{"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.782 187287 DEBUG oslo_concurrency.lockutils [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.782 187287 DEBUG nova.network.neutron [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Refreshing network info cache for port c819577b-9486-4fec-bb14-cb1c2713cb60 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.786 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Start _get_guest_xml network_info=[{"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.790 187287 WARNING nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.795 187287 DEBUG nova.virt.libvirt.host [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.796 187287 DEBUG nova.virt.libvirt.host [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.800 187287 DEBUG nova.virt.libvirt.host [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.801 187287 DEBUG nova.virt.libvirt.host [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.803 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.803 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.803 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.804 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.804 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.804 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.804 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.805 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.805 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.805 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.805 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.806 187287 DEBUG nova.virt.hardware [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.810 187287 DEBUG nova.virt.libvirt.vif [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:38:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1306738674',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1306738674',id=19,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-144el27n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,tas
k_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:38:12Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=973264bb-ebfb-4c89-a78d-154cf5c6c4ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.811 187287 DEBUG nova.network.os_vif_util [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.811 187287 DEBUG nova.network.os_vif_util [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=c819577b-9486-4fec-bb14-cb1c2713cb60,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc819577b-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.812 187287 DEBUG nova.objects.instance [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'pci_devices' on Instance uuid 973264bb-ebfb-4c89-a78d-154cf5c6c4ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.828 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <uuid>973264bb-ebfb-4c89-a78d-154cf5c6c4ac</uuid>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <name>instance-00000013</name>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteStrategies-server-1306738674</nova:name>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:38:15</nova:creationTime>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:user uuid="050e065161ed4e4dbf457584926aae78">tempest-TestExecuteStrategies-270472559-project-member</nova:user>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:project uuid="90982dffa8cd42a2b3280941bc8b991e">tempest-TestExecuteStrategies-270472559</nova:project>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        <nova:port uuid="c819577b-9486-4fec-bb14-cb1c2713cb60">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <entry name="serial">973264bb-ebfb-4c89-a78d-154cf5c6c4ac</entry>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <entry name="uuid">973264bb-ebfb-4c89-a78d-154cf5c6c4ac</entry>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk.config"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:d6:5a:70"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <target dev="tapc819577b-94"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/console.log" append="off"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:38:15 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:38:15 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:38:15 np0005544118 nova_compute[187283]: </domain>
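The domain XML fragment above (logged by Nova's `_get_guest_xml`) shows a virtio RNG backed by `/dev/urandom`, a q35-style `pcie-root` controller followed by a block of pre-allocated `pcie-root-port` controllers for device hotplug, and a virtio memballoon reporting stats every 10 seconds. A minimal sketch of inspecting such a fragment with the standard library — the XML string below is a reduced reproduction of only the elements shown in the log, not the full domain definition:

```python
import xml.etree.ElementTree as ET

# Reduced reproduction of the <devices> fragment from the log above
# (only the elements visible there; a real domain XML has many more).
xml = """
<devices>
  <rng model="virtio">
    <backend model="random">/dev/urandom</backend>
  </rng>
  <controller type="pci" model="pcie-root"/>
""" + '  <controller type="pci" model="pcie-root-port"/>\n' * 24 + """
  <controller type="usb" index="0"/>
  <memballoon model="virtio">
    <stats period="10"/>
  </memballoon>
</devices>
"""

devices = ET.fromstring(xml)
# Count the hotplug root ports Nova pre-allocated for this q35 guest.
root_ports = [c for c in devices.findall("controller")
              if c.get("model") == "pcie-root-port"]
print(len(root_ports))                    # 24, matching the log
print(devices.find("rng/backend").text)   # /dev/urandom
```

Each `pcie-root-port` occupies one PCIe slot, so the count here bounds how many devices can be hot-added to the guest later without a reboot.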
Dec  3 09:38:15 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.830 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Preparing to wait for external event network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.830 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.830 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.831 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.831 187287 DEBUG nova.virt.libvirt.vif [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:38:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1306738674',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1306738674',id=19,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-144el27n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:38:12Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=973264bb-ebfb-4c89-a78d-154cf5c6c4ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.832 187287 DEBUG nova.network.os_vif_util [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.832 187287 DEBUG nova.network.os_vif_util [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=c819577b-9486-4fec-bb14-cb1c2713cb60,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc819577b-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.833 187287 DEBUG os_vif [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=c819577b-9486-4fec-bb14-cb1c2713cb60,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc819577b-94') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.833 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.834 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.834 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.837 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.838 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc819577b-94, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.839 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc819577b-94, col_values=(('external_ids', {'iface-id': 'c819577b-9486-4fec-bb14-cb1c2713cb60', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:5a:70', 'vm-uuid': '973264bb-ebfb-4c89-a78d-154cf5c6c4ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.840 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:15 np0005544118 NetworkManager[55710]: <info>  [1764772695.8415] manager: (tapc819577b-94): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.842 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.848 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.850 187287 INFO os_vif [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=c819577b-9486-4fec-bb14-cb1c2713cb60,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc819577b-94')#033[00m
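The plug sequence above shows two conventions worth calling out: the VIF dump carries `"devname": "tapc819577b-94"` (the Neutron port UUID truncated into a 14-character Linux interface name), and the `DbSetCommand` at 14:38:15.839 stamps the OVS `Interface` row with `external_ids` whose `iface-id` is what ovn-controller later matches to claim the logical port. A sketch of both, with helper names of my own invention (the truncation rule is inferred from the device names in this log, not quoted from Nova source):

```python
def tap_devname(port_id, max_len=14):
    """Derive the tap device name seen in the log. Linux interface
    names are limited to 15 characters; keeping 14 means only the
    first 11 characters of the port UUID survive the truncation."""
    return ("tap" + port_id)[:max_len]

def ovs_external_ids(port_id, mac, instance_uuid):
    """The external_ids keys written by the DbSetCommand in the log;
    ovn-controller matches 'iface-id' against the OVN logical port
    name, producing the 'Claiming lport ...' messages below."""
    return {
        "iface-id": port_id,
        "iface-status": "active",
        "attached-mac": mac,
        "vm-uuid": instance_uuid,
    }

port = "c819577b-9486-4fec-bb14-cb1c2713cb60"
print(tap_devname(port))  # tapc819577b-94
```

Because the tap name is a deterministic function of the port UUID, the same name appears consistently across nova_compute, NetworkManager, and kernel messages for this port.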
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.912 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.912 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.912 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No VIF found with MAC fa:16:3e:d6:5a:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:38:15 np0005544118 nova_compute[187283]: 2025-12-03 14:38:15.913 187287 INFO nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Using config drive#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.242 187287 INFO nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Creating config drive at /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk.config#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.247 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuf4pjpai execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.375 187287 DEBUG oslo_concurrency.processutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuf4pjpai" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
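The two processutils lines above show Nova building the config drive: an ISO 9660 image with Joliet (`-J`) and Rock Ridge (`-r`) extensions, labelled `config-2` — the volume label cloud-init probes for when looking for a config drive. A sketch of assembling that argv in the order the log shows it (helper name and placeholder paths are mine; the real command is spawned by oslo.concurrency's processutils):

```python
def configdrive_cmd(output_path, tmpdir, publisher):
    """Assemble the mkisofs invocation seen in the log: a Joliet/
    Rock Ridge ISO labelled 'config-2' built from a staging dir."""
    return [
        "/usr/bin/mkisofs",
        "-o", output_path,
        "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
        "-publisher", publisher,
        "-quiet", "-J", "-r",
        "-V", "config-2",   # the label cloud-init searches for
        tmpdir,             # metadata staged here is packed into the ISO
    ]

cmd = configdrive_cmd("/var/lib/nova/instances/<uuid>/disk.config",
                      "/tmp/tmpuf4pjpai", "OpenStack Compute")
print(" ".join(cmd[-3:]))  # -V config-2 /tmp/tmpuf4pjpai
```

Note the log's `-publisher OpenStack Compute 27.5.2-...` value contains unquoted spaces in the rendered log line; processutils passes it as a single argv element, so no shell quoting issue actually arises.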
Dec  3 09:38:16 np0005544118 kernel: tapc819577b-94: entered promiscuous mode
Dec  3 09:38:16 np0005544118 NetworkManager[55710]: <info>  [1764772696.4321] manager: (tapc819577b-94): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.432 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:16 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:16Z|00175|binding|INFO|Claiming lport c819577b-9486-4fec-bb14-cb1c2713cb60 for this chassis.
Dec  3 09:38:16 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:16Z|00176|binding|INFO|c819577b-9486-4fec-bb14-cb1c2713cb60: Claiming fa:16:3e:d6:5a:70 10.100.0.14
Dec  3 09:38:16 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:16Z|00177|binding|INFO|Setting lport c819577b-9486-4fec-bb14-cb1c2713cb60 ovn-installed in OVS
Dec  3 09:38:16 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:16Z|00178|binding|INFO|Setting lport c819577b-9486-4fec-bb14-cb1c2713cb60 up in Southbound
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.447 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:5a:70 10.100.0.14'], port_security=['fa:16:3e:d6:5a:70 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '973264bb-ebfb-4c89-a78d-154cf5c6c4ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=c819577b-9486-4fec-bb14-cb1c2713cb60) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.449 104491 INFO neutron.agent.ovn.metadata.agent [-] Port c819577b-9486-4fec-bb14-cb1c2713cb60 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.450 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.450 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:38:16 np0005544118 systemd-udevd[216002]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.464 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c891f7ad-03c9-44b0-acc7-06c31a578da0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.464 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267c5b5d-11 in ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
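The metadata-provisioning lines above and below use a naming scheme that can be read straight off the log: an `ovnmeta-<network-uuid>` namespace, and a veth pair whose two ends share the truncated network UUID with a `0`/`1` suffix (`tap267c5b5d-10` outside, `tap267c5b5d-11` inside the namespace). A sketch of that derivation — inferred from these log lines, not quoted from the neutron implementation:

```python
def metadata_names(network_id):
    """Derive the names visible in the log for this network: the
    'ovnmeta-' namespace and the two veth ends. The 10-char UUID
    truncation plus 0/1 suffix is inferred from the device names
    logged here; treat this as a sketch, not neutron's code."""
    ns = "ovnmeta-" + network_id
    veth_out = "tap" + network_id[:10] + "0"  # stays in the root netns
    veth_in = "tap" + network_id[:10] + "1"   # moved into the namespace
    return ns, veth_out, veth_in

ns, out_dev, in_dev = metadata_names(
    "267c5b5d-1150-48df-8bea-7890da55de3f")
print(out_dev, in_dev)  # tap267c5b5d-10 tap267c5b5d-11
```

This explains the later `DelPortCommand(... port=tap267c5b5d-10 ...)` line: the agent cleans up any stale outer veth end before plugging the fresh one into the integration bridge.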
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.466 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267c5b5d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.466 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[210c9da7-ee85-43a3-9e94-9ca2474d95bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.467 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e5d3e99a-bd19-4ba9-8a35-bbdfb5bb131f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 systemd-machined[153602]: New machine qemu-16-instance-00000013.
Dec  3 09:38:16 np0005544118 NetworkManager[55710]: <info>  [1764772696.4773] device (tapc819577b-94): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:38:16 np0005544118 NetworkManager[55710]: <info>  [1764772696.4782] device (tapc819577b-94): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.479 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[e3dc3ce9-1414-4909-8cf3-dbd4d4bbdc4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 systemd[1]: Started Virtual Machine qemu-16-instance-00000013.
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.494 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c12d4b-7baa-4a1c-b07e-cc4115a31e39]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.520 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[5df44f28-244b-4c48-aa8e-6d590201fb98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.525 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe1693e-ead3-42de-88e0-733a402d30d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 NetworkManager[55710]: <info>  [1764772696.5266] manager: (tap267c5b5d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/70)
Dec  3 09:38:16 np0005544118 systemd-udevd[216006]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.555 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[36602828-7614-40ec-8276-c5b35dea0a4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.558 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c1ea33-e8df-4e95-8dd1-0d963dc4b1f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 NetworkManager[55710]: <info>  [1764772696.5796] device (tap267c5b5d-10): carrier: link connected
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.584 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[9f142d41-76b0-4296-b49b-b369074b1076]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.599 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4077b86b-e0e9-4ee8-ad07-3a7554000e9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485218, 'reachable_time': 34451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216035, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.616 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca97f70-fb7a-4a4c-8c7a-377bf9373430]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:71b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485218, 'tstamp': 485218}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216036, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.632 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[daf4833b-bfc7-4fa7-a97f-44c4335f9825]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485218, 'reachable_time': 34451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216039, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.662 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3c83b53e-7f98-4115-85b5-0c42e3b0f41b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.723 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a2982a51-7b7d-4827-b120-f5fedcf9c60d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.724 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.724 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.725 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:38:16 np0005544118 kernel: tap267c5b5d-10: entered promiscuous mode
Dec  3 09:38:16 np0005544118 NetworkManager[55710]: <info>  [1764772696.7632] manager: (tap267c5b5d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.763 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.768 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.770 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:16 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:16Z|00179|binding|INFO|Releasing lport f97a1eb5-3e39-4381-b113-959844eaa3b1 from this chassis (sb_readonly=0)
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.773 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.774 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7326456d-06bf-4901-8035-02bb2d68d869]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.775 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.776 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772696.7760928, 973264bb-ebfb-4c89-a78d-154cf5c6c4ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.777 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] VM Started (Lifecycle Event)#033[00m
Dec  3 09:38:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:38:16.777 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'env', 'PROCESS_TAG=haproxy-267c5b5d-1150-48df-8bea-7890da55de3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267c5b5d-1150-48df-8bea-7890da55de3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:38:16 np0005544118 nova_compute[187283]: 2025-12-03 14:38:16.782 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.035 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.039 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772696.7763193, 973264bb-ebfb-4c89-a78d-154cf5c6c4ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.040 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.062 187287 DEBUG nova.compute.manager [req-a72ca4bb-c85f-4f80-910f-b88372f9b8f9 req-01f9b5df-6554-410a-b35d-d602b45f6ef2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received event network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.062 187287 DEBUG oslo_concurrency.lockutils [req-a72ca4bb-c85f-4f80-910f-b88372f9b8f9 req-01f9b5df-6554-410a-b35d-d602b45f6ef2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.063 187287 DEBUG oslo_concurrency.lockutils [req-a72ca4bb-c85f-4f80-910f-b88372f9b8f9 req-01f9b5df-6554-410a-b35d-d602b45f6ef2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.063 187287 DEBUG oslo_concurrency.lockutils [req-a72ca4bb-c85f-4f80-910f-b88372f9b8f9 req-01f9b5df-6554-410a-b35d-d602b45f6ef2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.063 187287 DEBUG nova.compute.manager [req-a72ca4bb-c85f-4f80-910f-b88372f9b8f9 req-01f9b5df-6554-410a-b35d-d602b45f6ef2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Processing event network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.064 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.070 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.073 187287 INFO nova.virt.libvirt.driver [-] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Instance spawned successfully.#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.074 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.128 187287 DEBUG nova.network.neutron [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updated VIF entry in instance network info cache for port c819577b-9486-4fec-bb14-cb1c2713cb60. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.129 187287 DEBUG nova.network.neutron [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updating instance_info_cache with network_info: [{"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:38:17 np0005544118 podman[216074]: 2025-12-03 14:38:17.146108707 +0000 UTC m=+0.047598913 container create 81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.157 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.162 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772697.0671463, 973264bb-ebfb-4c89-a78d-154cf5c6c4ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.162 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:38:17 np0005544118 systemd[1]: Started libpod-conmon-81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54.scope.
Dec  3 09:38:17 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:38:17 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d46808484c77550e109e1fc199393a30737ce2e9b3a6dfbfb2e71b80d6b93661/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:38:17 np0005544118 podman[216074]: 2025-12-03 14:38:17.119473853 +0000 UTC m=+0.020964079 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:38:17 np0005544118 podman[216074]: 2025-12-03 14:38:17.222159432 +0000 UTC m=+0.123649658 container init 81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:38:17 np0005544118 podman[216074]: 2025-12-03 14:38:17.22752272 +0000 UTC m=+0.129012926 container start 81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Dec  3 09:38:17 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [NOTICE]   (216093) : New worker (216095) forked
Dec  3 09:38:17 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [NOTICE]   (216093) : Loading success.
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.476 187287 DEBUG oslo_concurrency.lockutils [req-d5f3ebbe-0c46-4328-8536-cfe52408fb5a req-e42f29ae-e55a-4c3e-9f9a-0797f6b3840b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.483 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.484 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.484 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.484 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.485 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.485 187287 DEBUG nova.virt.libvirt.driver [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.528 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.531 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.573 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.634 187287 INFO nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Took 4.48 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.635 187287 DEBUG nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:38:17 np0005544118 nova_compute[187283]: 2025-12-03 14:38:17.784 187287 INFO nova.compute.manager [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Took 6.09 seconds to build instance.#033[00m
Dec  3 09:38:18 np0005544118 nova_compute[187283]: 2025-12-03 14:38:18.001 187287 DEBUG oslo_concurrency.lockutils [None req-5b30b702-8dbb-4bee-8803-8ae14b7bbfb3 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:18 np0005544118 nova_compute[187283]: 2025-12-03 14:38:18.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:19 np0005544118 nova_compute[187283]: 2025-12-03 14:38:19.167 187287 DEBUG nova.compute.manager [req-ea891280-a865-47af-be3b-f392ad5d3472 req-adadc855-ca88-4aed-ba43-8fda4db1b85a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received event network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:38:19 np0005544118 nova_compute[187283]: 2025-12-03 14:38:19.168 187287 DEBUG oslo_concurrency.lockutils [req-ea891280-a865-47af-be3b-f392ad5d3472 req-adadc855-ca88-4aed-ba43-8fda4db1b85a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:19 np0005544118 nova_compute[187283]: 2025-12-03 14:38:19.168 187287 DEBUG oslo_concurrency.lockutils [req-ea891280-a865-47af-be3b-f392ad5d3472 req-adadc855-ca88-4aed-ba43-8fda4db1b85a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:19 np0005544118 nova_compute[187283]: 2025-12-03 14:38:19.168 187287 DEBUG oslo_concurrency.lockutils [req-ea891280-a865-47af-be3b-f392ad5d3472 req-adadc855-ca88-4aed-ba43-8fda4db1b85a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:19 np0005544118 nova_compute[187283]: 2025-12-03 14:38:19.169 187287 DEBUG nova.compute.manager [req-ea891280-a865-47af-be3b-f392ad5d3472 req-adadc855-ca88-4aed-ba43-8fda4db1b85a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] No waiting events found dispatching network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:38:19 np0005544118 nova_compute[187283]: 2025-12-03 14:38:19.169 187287 WARNING nova.compute.manager [req-ea891280-a865-47af-be3b-f392ad5d3472 req-adadc855-ca88-4aed-ba43-8fda4db1b85a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received unexpected event network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:38:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:38:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:38:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:38:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:38:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:38:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:38:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:38:19 np0005544118 nova_compute[187283]: 2025-12-03 14:38:19.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:20 np0005544118 nova_compute[187283]: 2025-12-03 14:38:20.011 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:20 np0005544118 nova_compute[187283]: 2025-12-03 14:38:20.841 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:21 np0005544118 nova_compute[187283]: 2025-12-03 14:38:21.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:23 np0005544118 nova_compute[187283]: 2025-12-03 14:38:23.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:23 np0005544118 nova_compute[187283]: 2025-12-03 14:38:23.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:24 np0005544118 nova_compute[187283]: 2025-12-03 14:38:24.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:24 np0005544118 nova_compute[187283]: 2025-12-03 14:38:24.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:38:24 np0005544118 nova_compute[187283]: 2025-12-03 14:38:24.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:38:24 np0005544118 nova_compute[187283]: 2025-12-03 14:38:24.764 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:38:24 np0005544118 nova_compute[187283]: 2025-12-03 14:38:24.765 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:38:24 np0005544118 nova_compute[187283]: 2025-12-03 14:38:24.765 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:38:24 np0005544118 nova_compute[187283]: 2025-12-03 14:38:24.765 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 973264bb-ebfb-4c89-a78d-154cf5c6c4ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:38:24 np0005544118 podman[216104]: 2025-12-03 14:38:24.818226212 +0000 UTC m=+0.053210806 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec  3 09:38:25 np0005544118 nova_compute[187283]: 2025-12-03 14:38:25.013 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:25 np0005544118 nova_compute[187283]: 2025-12-03 14:38:25.843 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.171 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updating instance_info_cache with network_info: [{"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.192 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.192 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.192 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.213 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.213 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.214 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.214 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.276 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.337 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.338 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.394 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.546 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.548 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5728MB free_disk=73.33488845825195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.548 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.548 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.628 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 973264bb-ebfb-4c89-a78d-154cf5c6c4ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.628 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.628 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.716 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.735 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.758 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:38:26 np0005544118 nova_compute[187283]: 2025-12-03 14:38:26.759 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:38:27 np0005544118 podman[216129]: 2025-12-03 14:38:27.841064151 +0000 UTC m=+0.060583209 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:38:29 np0005544118 nova_compute[187283]: 2025-12-03 14:38:29.755 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:30 np0005544118 nova_compute[187283]: 2025-12-03 14:38:30.018 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:30 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:30Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:5a:70 10.100.0.14
Dec  3 09:38:30 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:30Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:5a:70 10.100.0.14
Dec  3 09:38:30 np0005544118 nova_compute[187283]: 2025-12-03 14:38:30.845 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:32 np0005544118 nova_compute[187283]: 2025-12-03 14:38:32.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:38:32 np0005544118 nova_compute[187283]: 2025-12-03 14:38:32.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:38:32 np0005544118 podman[216161]: 2025-12-03 14:38:32.884810092 +0000 UTC m=+0.117041114 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:38:35 np0005544118 nova_compute[187283]: 2025-12-03 14:38:35.019 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:35 np0005544118 podman[197639]: time="2025-12-03T14:38:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:38:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:38:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:38:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:38:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3055 "" "Go-http-client/1.1"
Dec  3 09:38:35 np0005544118 nova_compute[187283]: 2025-12-03 14:38:35.847 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:40 np0005544118 nova_compute[187283]: 2025-12-03 14:38:40.021 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:40 np0005544118 nova_compute[187283]: 2025-12-03 14:38:40.849 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:42 np0005544118 podman[216191]: 2025-12-03 14:38:42.820464423 +0000 UTC m=+0.054689758 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:38:45 np0005544118 nova_compute[187283]: 2025-12-03 14:38:45.066 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:45 np0005544118 podman[216212]: 2025-12-03 14:38:45.827367974 +0000 UTC m=+0.058111702 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:38:45 np0005544118 nova_compute[187283]: 2025-12-03 14:38:45.851 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:38:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:38:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:38:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:38:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:38:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:38:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:38:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:38:50 np0005544118 nova_compute[187283]: 2025-12-03 14:38:50.067 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:50 np0005544118 nova_compute[187283]: 2025-12-03 14:38:50.852 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:55 np0005544118 nova_compute[187283]: 2025-12-03 14:38:55.075 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:55 np0005544118 ovn_controller[95637]: 2025-12-03T14:38:55Z|00180|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  3 09:38:55 np0005544118 podman[216233]: 2025-12-03 14:38:55.821878917 +0000 UTC m=+0.054072299 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:38:55 np0005544118 nova_compute[187283]: 2025-12-03 14:38:55.855 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:38:58 np0005544118 podman[216252]: 2025-12-03 14:38:58.810322089 +0000 UTC m=+0.046423230 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:39:00 np0005544118 nova_compute[187283]: 2025-12-03 14:39:00.078 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:00 np0005544118 nova_compute[187283]: 2025-12-03 14:39:00.857 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:00.970 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:00.971 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:00.972 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:03 np0005544118 podman[216278]: 2025-12-03 14:39:03.85367362 +0000 UTC m=+0.080707885 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Dec  3 09:39:05 np0005544118 nova_compute[187283]: 2025-12-03 14:39:05.080 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:05 np0005544118 podman[197639]: time="2025-12-03T14:39:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:39:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:39:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:39:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:39:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Dec  3 09:39:05 np0005544118 nova_compute[187283]: 2025-12-03 14:39:05.859 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:10 np0005544118 nova_compute[187283]: 2025-12-03 14:39:10.082 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:10 np0005544118 nova_compute[187283]: 2025-12-03 14:39:10.862 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:13 np0005544118 nova_compute[187283]: 2025-12-03 14:39:13.020 187287 DEBUG nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Creating tmpfile /var/lib/nova/instances/tmplyqv7nnm to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:39:13 np0005544118 nova_compute[187283]: 2025-12-03 14:39:13.021 187287 DEBUG nova.compute.manager [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplyqv7nnm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:39:13 np0005544118 podman[216304]: 2025-12-03 14:39:13.817823003 +0000 UTC m=+0.052337512 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  3 09:39:13 np0005544118 nova_compute[187283]: 2025-12-03 14:39:13.855 187287 DEBUG nova.compute.manager [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplyqv7nnm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bb687bfa-6895-44f9-91e0-0ca8de621adf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:39:13 np0005544118 nova_compute[187283]: 2025-12-03 14:39:13.877 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-bb687bfa-6895-44f9-91e0-0ca8de621adf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:39:13 np0005544118 nova_compute[187283]: 2025-12-03 14:39:13.878 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-bb687bfa-6895-44f9-91e0-0ca8de621adf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:39:13 np0005544118 nova_compute[187283]: 2025-12-03 14:39:13.878 187287 DEBUG nova.network.neutron [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.710 187287 DEBUG nova.network.neutron [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Updating instance_info_cache with network_info: [{"id": "8ff9415d-a109-43c3-9169-889db9318c66", "address": "fa:16:3e:30:2e:25", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ff9415d-a1", "ovs_interfaceid": "8ff9415d-a109-43c3-9169-889db9318c66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.738 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-bb687bfa-6895-44f9-91e0-0ca8de621adf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.741 187287 DEBUG nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplyqv7nnm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bb687bfa-6895-44f9-91e0-0ca8de621adf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.741 187287 DEBUG nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Creating instance directory: /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.742 187287 DEBUG nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Creating disk.info with the contents: {'/var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk': 'qcow2', '/var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.742 187287 DEBUG nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.743 187287 DEBUG nova.objects.instance [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid bb687bfa-6895-44f9-91e0-0ca8de621adf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.768 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.829 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.831 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.832 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.848 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.911 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.913 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.952 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.954 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:14 np0005544118 nova_compute[187283]: 2025-12-03 14:39:14.954 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.017 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.019 187287 DEBUG nova.virt.disk.api [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.020 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.084 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.087 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.088 187287 DEBUG nova.virt.disk.api [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.089 187287 DEBUG nova.objects.instance [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid bb687bfa-6895-44f9-91e0-0ca8de621adf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.108 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.138 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk.config 485376" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.142 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk.config to /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.142 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk.config /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.671 187287 DEBUG oslo_concurrency.processutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk.config /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.672 187287 DEBUG nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.674 187287 DEBUG nova.virt.libvirt.vif [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:38:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-131474212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-131474212',id=20,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:38:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-1xfv0uav',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:38:28Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=bb687bfa-6895-44f9-91e0-0ca8de621adf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8ff9415d-a109-43c3-9169-889db9318c66", "address": "fa:16:3e:30:2e:25", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8ff9415d-a1", "ovs_interfaceid": "8ff9415d-a109-43c3-9169-889db9318c66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.674 187287 DEBUG nova.network.os_vif_util [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "8ff9415d-a109-43c3-9169-889db9318c66", "address": "fa:16:3e:30:2e:25", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap8ff9415d-a1", "ovs_interfaceid": "8ff9415d-a109-43c3-9169-889db9318c66", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.676 187287 DEBUG nova.network.os_vif_util [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:30:2e:25,bridge_name='br-int',has_traffic_filtering=True,id=8ff9415d-a109-43c3-9169-889db9318c66,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ff9415d-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.676 187287 DEBUG os_vif [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:2e:25,bridge_name='br-int',has_traffic_filtering=True,id=8ff9415d-a109-43c3-9169-889db9318c66,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ff9415d-a1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.677 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.677 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.678 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.686 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.687 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8ff9415d-a1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.687 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8ff9415d-a1, col_values=(('external_ids', {'iface-id': '8ff9415d-a109-43c3-9169-889db9318c66', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:30:2e:25', 'vm-uuid': 'bb687bfa-6895-44f9-91e0-0ca8de621adf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.689 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:15 np0005544118 NetworkManager[55710]: <info>  [1764772755.6912] manager: (tap8ff9415d-a1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.692 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.697 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.700 187287 INFO os_vif [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:30:2e:25,bridge_name='br-int',has_traffic_filtering=True,id=8ff9415d-a109-43c3-9169-889db9318c66,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ff9415d-a1')#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.701 187287 DEBUG nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:39:15 np0005544118 nova_compute[187283]: 2025-12-03 14:39:15.701 187287 DEBUG nova.compute.manager [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplyqv7nnm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bb687bfa-6895-44f9-91e0-0ca8de621adf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:39:16 np0005544118 podman[216356]: 2025-12-03 14:39:16.813481925 +0000 UTC m=+0.048368184 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  3 09:39:17 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:17.753 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:39:17 np0005544118 nova_compute[187283]: 2025-12-03 14:39:17.753 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:17 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:17.754 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:39:18 np0005544118 nova_compute[187283]: 2025-12-03 14:39:18.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:18 np0005544118 nova_compute[187283]: 2025-12-03 14:39:18.818 187287 DEBUG nova.network.neutron [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Port 8ff9415d-a109-43c3-9169-889db9318c66 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:39:18 np0005544118 nova_compute[187283]: 2025-12-03 14:39:18.819 187287 DEBUG nova.compute.manager [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplyqv7nnm',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='bb687bfa-6895-44f9-91e0-0ca8de621adf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:39:18 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:39:18 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:39:19 np0005544118 kernel: tap8ff9415d-a1: entered promiscuous mode
Dec  3 09:39:19 np0005544118 NetworkManager[55710]: <info>  [1764772759.1421] manager: (tap8ff9415d-a1): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Dec  3 09:39:19 np0005544118 nova_compute[187283]: 2025-12-03 14:39:19.142 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:19 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:19Z|00181|binding|INFO|Claiming lport 8ff9415d-a109-43c3-9169-889db9318c66 for this additional chassis.
Dec  3 09:39:19 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:19Z|00182|binding|INFO|8ff9415d-a109-43c3-9169-889db9318c66: Claiming fa:16:3e:30:2e:25 10.100.0.12
Dec  3 09:39:19 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:19Z|00183|binding|INFO|Setting lport 8ff9415d-a109-43c3-9169-889db9318c66 ovn-installed in OVS
Dec  3 09:39:19 np0005544118 nova_compute[187283]: 2025-12-03 14:39:19.155 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:19 np0005544118 nova_compute[187283]: 2025-12-03 14:39:19.158 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:19 np0005544118 systemd-udevd[216410]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:39:19 np0005544118 systemd-machined[153602]: New machine qemu-17-instance-00000014.
Dec  3 09:39:19 np0005544118 NetworkManager[55710]: <info>  [1764772759.1856] device (tap8ff9415d-a1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:39:19 np0005544118 NetworkManager[55710]: <info>  [1764772759.1867] device (tap8ff9415d-a1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:39:19 np0005544118 systemd[1]: Started Virtual Machine qemu-17-instance-00000014.
Dec  3 09:39:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:39:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:39:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:39:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:39:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:39:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:39:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:39:19 np0005544118 nova_compute[187283]: 2025-12-03 14:39:19.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:19 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:19.757 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:20 np0005544118 nova_compute[187283]: 2025-12-03 14:39:20.086 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:20 np0005544118 nova_compute[187283]: 2025-12-03 14:39:20.459 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772760.459446, bb687bfa-6895-44f9-91e0-0ca8de621adf => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:39:20 np0005544118 nova_compute[187283]: 2025-12-03 14:39:20.460 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] VM Started (Lifecycle Event)#033[00m
Dec  3 09:39:20 np0005544118 nova_compute[187283]: 2025-12-03 14:39:20.482 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:39:20 np0005544118 nova_compute[187283]: 2025-12-03 14:39:20.690 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:21 np0005544118 nova_compute[187283]: 2025-12-03 14:39:21.347 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772761.3470588, bb687bfa-6895-44f9-91e0-0ca8de621adf => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:39:21 np0005544118 nova_compute[187283]: 2025-12-03 14:39:21.348 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:39:21 np0005544118 nova_compute[187283]: 2025-12-03 14:39:21.368 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:39:21 np0005544118 nova_compute[187283]: 2025-12-03 14:39:21.371 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:39:21 np0005544118 nova_compute[187283]: 2025-12-03 14:39:21.390 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:39:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:22Z|00184|binding|INFO|Claiming lport 8ff9415d-a109-43c3-9169-889db9318c66 for this chassis.
Dec  3 09:39:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:22Z|00185|binding|INFO|8ff9415d-a109-43c3-9169-889db9318c66: Claiming fa:16:3e:30:2e:25 10.100.0.12
Dec  3 09:39:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:22Z|00186|binding|INFO|Setting lport 8ff9415d-a109-43c3-9169-889db9318c66 up in Southbound
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.254 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:2e:25 10.100.0.12'], port_security=['fa:16:3e:30:2e:25 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bb687bfa-6895-44f9-91e0-0ca8de621adf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=8ff9415d-a109-43c3-9169-889db9318c66) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.255 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 8ff9415d-a109-43c3-9169-889db9318c66 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.256 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.273 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cad154a6-02f3-4ea4-9fc7-9243970b7a26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.304 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[76c49075-676a-459f-95fd-b07d1b43be3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.308 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[6807fcf9-a8b6-4a1e-aa13-806720c9e1e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.331 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[2b9a1bf5-6aef-467e-ad5d-8f2abac8b5e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.347 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4f31ee9f-d0b8-4474-9c95-b7467d533c64]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485218, 'reachable_time': 34451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216435, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.362 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9e917c93-cdf9-40f0-ab11-0195dbab44ff]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485229, 'tstamp': 485229}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216436, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485232, 'tstamp': 485232}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216436, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.365 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:22 np0005544118 nova_compute[187283]: 2025-12-03 14:39:22.367 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.368 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.368 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.369 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:22.369 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:39:22 np0005544118 nova_compute[187283]: 2025-12-03 14:39:22.422 187287 INFO nova.compute.manager [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Post operation of migration started#033[00m
Dec  3 09:39:22 np0005544118 nova_compute[187283]: 2025-12-03 14:39:22.627 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-bb687bfa-6895-44f9-91e0-0ca8de621adf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:39:22 np0005544118 nova_compute[187283]: 2025-12-03 14:39:22.628 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-bb687bfa-6895-44f9-91e0-0ca8de621adf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:39:22 np0005544118 nova_compute[187283]: 2025-12-03 14:39:22.628 187287 DEBUG nova.network.neutron [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:39:23 np0005544118 nova_compute[187283]: 2025-12-03 14:39:23.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:23 np0005544118 nova_compute[187283]: 2025-12-03 14:39:23.849 187287 DEBUG nova.network.neutron [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Updating instance_info_cache with network_info: [{"id": "8ff9415d-a109-43c3-9169-889db9318c66", "address": "fa:16:3e:30:2e:25", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ff9415d-a1", "ovs_interfaceid": "8ff9415d-a109-43c3-9169-889db9318c66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:39:23 np0005544118 nova_compute[187283]: 2025-12-03 14:39:23.871 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-bb687bfa-6895-44f9-91e0-0ca8de621adf" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:39:23 np0005544118 nova_compute[187283]: 2025-12-03 14:39:23.888 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:23 np0005544118 nova_compute[187283]: 2025-12-03 14:39:23.889 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:23 np0005544118 nova_compute[187283]: 2025-12-03 14:39:23.889 187287 DEBUG oslo_concurrency.lockutils [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:23 np0005544118 nova_compute[187283]: 2025-12-03 14:39:23.894 187287 INFO nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:39:23 np0005544118 virtqemud[186958]: Domain id=17 name='instance-00000014' uuid=bb687bfa-6895-44f9-91e0-0ca8de621adf is tainted: custom-monitor
Dec  3 09:39:24 np0005544118 nova_compute[187283]: 2025-12-03 14:39:24.900 187287 INFO nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:39:25 np0005544118 nova_compute[187283]: 2025-12-03 14:39:25.088 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:25 np0005544118 nova_compute[187283]: 2025-12-03 14:39:25.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:25 np0005544118 nova_compute[187283]: 2025-12-03 14:39:25.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:25 np0005544118 nova_compute[187283]: 2025-12-03 14:39:25.691 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:25 np0005544118 nova_compute[187283]: 2025-12-03 14:39:25.906 187287 INFO nova.virt.libvirt.driver [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:39:25 np0005544118 nova_compute[187283]: 2025-12-03 14:39:25.912 187287 DEBUG nova.compute.manager [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:39:25 np0005544118 nova_compute[187283]: 2025-12-03 14:39:25.938 187287 DEBUG nova.objects.instance [None req-134510a2-db9d-4b30-b7e2-343fcfaec6b9 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:39:26 np0005544118 nova_compute[187283]: 2025-12-03 14:39:26.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:26 np0005544118 nova_compute[187283]: 2025-12-03 14:39:26.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:39:26 np0005544118 nova_compute[187283]: 2025-12-03 14:39:26.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:39:26 np0005544118 nova_compute[187283]: 2025-12-03 14:39:26.786 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:39:26 np0005544118 nova_compute[187283]: 2025-12-03 14:39:26.787 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:39:26 np0005544118 nova_compute[187283]: 2025-12-03 14:39:26.787 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:39:26 np0005544118 nova_compute[187283]: 2025-12-03 14:39:26.788 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 973264bb-ebfb-4c89-a78d-154cf5c6c4ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:39:26 np0005544118 podman[216437]: 2025-12-03 14:39:26.836296977 +0000 UTC m=+0.070349028 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.057 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updating instance_info_cache with network_info: [{"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.072 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-973264bb-ebfb-4c89-a78d-154cf5c6c4ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.073 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.073 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.094 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.094 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.095 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.095 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.156 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.212 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.213 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.271 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.277 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.338 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.339 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.394 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.542 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.543 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5508MB free_disk=73.27649307250977GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.543 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.544 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.628 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 973264bb-ebfb-4c89-a78d-154cf5c6c4ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.629 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance bb687bfa-6895-44f9-91e0-0ca8de621adf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.629 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.629 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.643 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.659 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.659 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.673 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.696 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.750 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.763 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.782 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:39:28 np0005544118 nova_compute[187283]: 2025-12-03 14:39:28.782 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:29 np0005544118 podman[216471]: 2025-12-03 14:39:29.827511095 +0000 UTC m=+0.054464101 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.121 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.632 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "bb687bfa-6895-44f9-91e0-0ca8de621adf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.633 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.633 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.633 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.633 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.634 187287 INFO nova.compute.manager [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Terminating instance#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.635 187287 DEBUG nova.compute.manager [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.692 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 kernel: tap8ff9415d-a1 (unregistering): left promiscuous mode
Dec  3 09:39:30 np0005544118 NetworkManager[55710]: <info>  [1764772770.7337] device (tap8ff9415d-a1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.738 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:30Z|00187|binding|INFO|Releasing lport 8ff9415d-a109-43c3-9169-889db9318c66 from this chassis (sb_readonly=0)
Dec  3 09:39:30 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:30Z|00188|binding|INFO|Setting lport 8ff9415d-a109-43c3-9169-889db9318c66 down in Southbound
Dec  3 09:39:30 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:30Z|00189|binding|INFO|Removing iface tap8ff9415d-a1 ovn-installed in OVS
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.740 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.750 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:30:2e:25 10.100.0.12'], port_security=['fa:16:3e:30:2e:25 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'bb687bfa-6895-44f9-91e0-0ca8de621adf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=8ff9415d-a109-43c3-9169-889db9318c66) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.751 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 8ff9415d-a109-43c3-9169-889db9318c66 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.752 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.755 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.767 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d1867aa0-499c-43ad-bca6-873743c97ce1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.778 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:30 np0005544118 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000014.scope: Deactivated successfully.
Dec  3 09:39:30 np0005544118 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000014.scope: Consumed 2.123s CPU time.
Dec  3 09:39:30 np0005544118 systemd-machined[153602]: Machine qemu-17-instance-00000014 terminated.
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.794 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[249b18bf-7420-4223-ae69-0d7e1260acc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.797 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[da44c856-07a7-4d12-9d0d-17be363d14eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.825 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[deda21c5-a4ea-4bd2-a583-98cd3b9aa1e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.842 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[14d8d9b1-9d6b-401a-af62-adbc69f10b4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485218, 'reachable_time': 34451, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216508, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:30 np0005544118 NetworkManager[55710]: <info>  [1764772770.8598] manager: (tap8ff9415d-a1): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.859 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3eded8-ad5c-4c68-a5f2-a5060d111093]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485229, 'tstamp': 485229}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216510, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485232, 'tstamp': 485232}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216510, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.860 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.861 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.863 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.867 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.871 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.872 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.873 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.873 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:30 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:30.873 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.901 187287 INFO nova.virt.libvirt.driver [-] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Instance destroyed successfully.#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.902 187287 DEBUG nova.objects.instance [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid bb687bfa-6895-44f9-91e0-0ca8de621adf obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.914 187287 DEBUG nova.virt.libvirt.vif [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:38:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-131474212',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-131474212',id=20,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:38:28Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-1xfv0uav',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:39:25Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=bb687bfa-6895-44f9-91e0-0ca8de621adf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8ff9415d-a109-43c3-9169-889db9318c66", "address": "fa:16:3e:30:2e:25", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ff9415d-a1", "ovs_interfaceid": "8ff9415d-a109-43c3-9169-889db9318c66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.915 187287 DEBUG nova.network.os_vif_util [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "8ff9415d-a109-43c3-9169-889db9318c66", "address": "fa:16:3e:30:2e:25", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8ff9415d-a1", "ovs_interfaceid": "8ff9415d-a109-43c3-9169-889db9318c66", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.916 187287 DEBUG nova.network.os_vif_util [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:30:2e:25,bridge_name='br-int',has_traffic_filtering=True,id=8ff9415d-a109-43c3-9169-889db9318c66,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ff9415d-a1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.916 187287 DEBUG os_vif [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:2e:25,bridge_name='br-int',has_traffic_filtering=True,id=8ff9415d-a109-43c3-9169-889db9318c66,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ff9415d-a1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.917 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.917 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8ff9415d-a1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.919 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.920 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.923 187287 INFO os_vif [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:30:2e:25,bridge_name='br-int',has_traffic_filtering=True,id=8ff9415d-a109-43c3-9169-889db9318c66,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8ff9415d-a1')#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.923 187287 INFO nova.virt.libvirt.driver [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Deleting instance files /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf_del#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.924 187287 INFO nova.virt.libvirt.driver [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Deletion of /var/lib/nova/instances/bb687bfa-6895-44f9-91e0-0ca8de621adf_del complete#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.964 187287 INFO nova.compute.manager [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.965 187287 DEBUG oslo.service.loopingcall [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.965 187287 DEBUG nova.compute.manager [-] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:39:30 np0005544118 nova_compute[187283]: 2025-12-03 14:39:30.965 187287 DEBUG nova.network.neutron [-] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:39:32 np0005544118 nova_compute[187283]: 2025-12-03 14:39:32.847 187287 DEBUG nova.compute.manager [req-2ed5afe6-3e40-46a6-9f88-ff94032c2ff3 req-82e18d35-1f79-4865-996c-3a899ec01bce c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Received event network-vif-unplugged-8ff9415d-a109-43c3-9169-889db9318c66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:39:32 np0005544118 nova_compute[187283]: 2025-12-03 14:39:32.847 187287 DEBUG oslo_concurrency.lockutils [req-2ed5afe6-3e40-46a6-9f88-ff94032c2ff3 req-82e18d35-1f79-4865-996c-3a899ec01bce c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:32 np0005544118 nova_compute[187283]: 2025-12-03 14:39:32.848 187287 DEBUG oslo_concurrency.lockutils [req-2ed5afe6-3e40-46a6-9f88-ff94032c2ff3 req-82e18d35-1f79-4865-996c-3a899ec01bce c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:32 np0005544118 nova_compute[187283]: 2025-12-03 14:39:32.848 187287 DEBUG oslo_concurrency.lockutils [req-2ed5afe6-3e40-46a6-9f88-ff94032c2ff3 req-82e18d35-1f79-4865-996c-3a899ec01bce c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:32 np0005544118 nova_compute[187283]: 2025-12-03 14:39:32.848 187287 DEBUG nova.compute.manager [req-2ed5afe6-3e40-46a6-9f88-ff94032c2ff3 req-82e18d35-1f79-4865-996c-3a899ec01bce c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] No waiting events found dispatching network-vif-unplugged-8ff9415d-a109-43c3-9169-889db9318c66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:39:32 np0005544118 nova_compute[187283]: 2025-12-03 14:39:32.849 187287 DEBUG nova.compute.manager [req-2ed5afe6-3e40-46a6-9f88-ff94032c2ff3 req-82e18d35-1f79-4865-996c-3a899ec01bce c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Received event network-vif-unplugged-8ff9415d-a109-43c3-9169-889db9318c66 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:39:33 np0005544118 nova_compute[187283]: 2025-12-03 14:39:33.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:33 np0005544118 nova_compute[187283]: 2025-12-03 14:39:33.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:39:33 np0005544118 nova_compute[187283]: 2025-12-03 14:39:33.981 187287 DEBUG nova.network.neutron [-] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.005 187287 INFO nova.compute.manager [-] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Took 3.04 seconds to deallocate network for instance.#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.053 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.054 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.130 187287 DEBUG nova.compute.provider_tree [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.148 187287 DEBUG nova.scheduler.client.report [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.166 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.191 187287 INFO nova.scheduler.client.report [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance bb687bfa-6895-44f9-91e0-0ca8de621adf#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.458 187287 DEBUG oslo_concurrency.lockutils [None req-41eb616a-14e3-4c1b-a9bf-55f61e075c50 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:34 np0005544118 podman[216525]: 2025-12-03 14:39:34.840797188 +0000 UTC m=+0.071038268 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.921 187287 DEBUG nova.compute.manager [req-93c5ffce-9cfc-4561-895c-c0560e7eba29 req-98172c93-a681-422b-9ad8-aba16ca4e63e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Received event network-vif-deleted-8ff9415d-a109-43c3-9169-889db9318c66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.921 187287 DEBUG nova.compute.manager [req-93c5ffce-9cfc-4561-895c-c0560e7eba29 req-98172c93-a681-422b-9ad8-aba16ca4e63e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Received event network-vif-plugged-8ff9415d-a109-43c3-9169-889db9318c66 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.922 187287 DEBUG oslo_concurrency.lockutils [req-93c5ffce-9cfc-4561-895c-c0560e7eba29 req-98172c93-a681-422b-9ad8-aba16ca4e63e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.922 187287 DEBUG oslo_concurrency.lockutils [req-93c5ffce-9cfc-4561-895c-c0560e7eba29 req-98172c93-a681-422b-9ad8-aba16ca4e63e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.922 187287 DEBUG oslo_concurrency.lockutils [req-93c5ffce-9cfc-4561-895c-c0560e7eba29 req-98172c93-a681-422b-9ad8-aba16ca4e63e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "bb687bfa-6895-44f9-91e0-0ca8de621adf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.922 187287 DEBUG nova.compute.manager [req-93c5ffce-9cfc-4561-895c-c0560e7eba29 req-98172c93-a681-422b-9ad8-aba16ca4e63e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] No waiting events found dispatching network-vif-plugged-8ff9415d-a109-43c3-9169-889db9318c66 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:39:34 np0005544118 nova_compute[187283]: 2025-12-03 14:39:34.922 187287 WARNING nova.compute.manager [req-93c5ffce-9cfc-4561-895c-c0560e7eba29 req-98172c93-a681-422b-9ad8-aba16ca4e63e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Received unexpected event network-vif-plugged-8ff9415d-a109-43c3-9169-889db9318c66 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.322 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.323 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.323 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.323 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.323 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.324 187287 INFO nova.compute.manager [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Terminating instance#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.325 187287 DEBUG nova.compute.manager [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:39:35 np0005544118 kernel: tapc819577b-94 (unregistering): left promiscuous mode
Dec  3 09:39:35 np0005544118 NetworkManager[55710]: <info>  [1764772775.5088] device (tapc819577b-94): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:39:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:35Z|00190|binding|INFO|Releasing lport c819577b-9486-4fec-bb14-cb1c2713cb60 from this chassis (sb_readonly=0)
Dec  3 09:39:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:35Z|00191|binding|INFO|Setting lport c819577b-9486-4fec-bb14-cb1c2713cb60 down in Southbound
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.513 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:39:35Z|00192|binding|INFO|Removing iface tapc819577b-94 ovn-installed in OVS
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.514 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.528 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:35.545 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:5a:70 10.100.0.14'], port_security=['fa:16:3e:d6:5a:70 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '973264bb-ebfb-4c89-a78d-154cf5c6c4ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=c819577b-9486-4fec-bb14-cb1c2713cb60) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:39:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:35.546 104491 INFO neutron.agent.ovn.metadata.agent [-] Port c819577b-9486-4fec-bb14-cb1c2713cb60 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:39:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:35.547 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267c5b5d-1150-48df-8bea-7890da55de3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:39:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:35.548 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[39ac8b49-7a60-435e-8d09-1d31fb76918d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:35.549 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace which is not needed anymore#033[00m
Dec  3 09:39:35 np0005544118 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000013.scope: Deactivated successfully.
Dec  3 09:39:35 np0005544118 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000013.scope: Consumed 15.226s CPU time.
Dec  3 09:39:35 np0005544118 systemd-machined[153602]: Machine qemu-16-instance-00000013 terminated.
Dec  3 09:39:35 np0005544118 podman[197639]: time="2025-12-03T14:39:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:39:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:39:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:39:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:39:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3062 "" "Go-http-client/1.1"
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.785 187287 INFO nova.virt.libvirt.driver [-] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Instance destroyed successfully.#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.785 187287 DEBUG nova.objects.instance [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid 973264bb-ebfb-4c89-a78d-154cf5c6c4ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.803 187287 DEBUG nova.virt.libvirt.vif [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:38:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1306738674',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1306738674',id=19,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:38:17Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-144el27n',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:38:17Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=973264bb-ebfb-4c89-a78d-154cf5c6c4ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.804 187287 DEBUG nova.network.os_vif_util [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "c819577b-9486-4fec-bb14-cb1c2713cb60", "address": "fa:16:3e:d6:5a:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc819577b-94", "ovs_interfaceid": "c819577b-9486-4fec-bb14-cb1c2713cb60", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.804 187287 DEBUG nova.network.os_vif_util [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=c819577b-9486-4fec-bb14-cb1c2713cb60,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc819577b-94') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.805 187287 DEBUG os_vif [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=c819577b-9486-4fec-bb14-cb1c2713cb60,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc819577b-94') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.806 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.806 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc819577b-94, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.807 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.809 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.810 187287 INFO os_vif [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=c819577b-9486-4fec-bb14-cb1c2713cb60,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc819577b-94')#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.811 187287 INFO nova.virt.libvirt.driver [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Deleting instance files /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac_del#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.811 187287 INFO nova.virt.libvirt.driver [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Deletion of /var/lib/nova/instances/973264bb-ebfb-4c89-a78d-154cf5c6c4ac_del complete#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.862 187287 INFO nova.compute.manager [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Took 0.54 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.862 187287 DEBUG oslo.service.loopingcall [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.864 187287 DEBUG nova.compute.manager [-] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:39:35 np0005544118 nova_compute[187283]: 2025-12-03 14:39:35.864 187287 DEBUG nova.network.neutron [-] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:39:36 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [NOTICE]   (216093) : haproxy version is 2.8.14-c23fe91
Dec  3 09:39:36 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [NOTICE]   (216093) : path to executable is /usr/sbin/haproxy
Dec  3 09:39:36 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [WARNING]  (216093) : Exiting Master process...
Dec  3 09:39:36 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [WARNING]  (216093) : Exiting Master process...
Dec  3 09:39:36 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [ALERT]    (216093) : Current worker (216095) exited with code 143 (Terminated)
Dec  3 09:39:36 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216089]: [WARNING]  (216093) : All workers exited. Exiting... (0)
Dec  3 09:39:36 np0005544118 systemd[1]: libpod-81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54.scope: Deactivated successfully.
Dec  3 09:39:36 np0005544118 podman[216577]: 2025-12-03 14:39:36.297037498 +0000 UTC m=+0.664046611 container died 81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  3 09:39:36 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54-userdata-shm.mount: Deactivated successfully.
Dec  3 09:39:36 np0005544118 systemd[1]: var-lib-containers-storage-overlay-d46808484c77550e109e1fc199393a30737ce2e9b3a6dfbfb2e71b80d6b93661-merged.mount: Deactivated successfully.
Dec  3 09:39:36 np0005544118 podman[216577]: 2025-12-03 14:39:36.491051452 +0000 UTC m=+0.858060555 container cleanup 81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:39:36 np0005544118 systemd[1]: libpod-conmon-81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54.scope: Deactivated successfully.
Dec  3 09:39:36 np0005544118 nova_compute[187283]: 2025-12-03 14:39:36.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:39:36 np0005544118 podman[216626]: 2025-12-03 14:39:36.962446855 +0000 UTC m=+0.452009141 container remove 81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:39:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:36.967 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d1a2bf9a-1eee-415f-8f0a-35c45c884f92]: (4, ('Wed Dec  3 02:39:35 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54)\n81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54\nWed Dec  3 02:39:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54)\n81e786bd64eb8ca04bbba697ef624f07fab670ca2ed7347672df54f728ce7b54\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:36.969 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[14d25d5d-fc0b-4bbf-8d3a-a022d3fb199d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:36.969 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:39:36 np0005544118 nova_compute[187283]: 2025-12-03 14:39:36.971 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:36 np0005544118 kernel: tap267c5b5d-10: left promiscuous mode
Dec  3 09:39:36 np0005544118 nova_compute[187283]: 2025-12-03 14:39:36.988 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:36 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:36.991 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[85da1ef9-b6d2-4e8c-9b6d-dff7384d3c72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:37.006 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcc5ede-a884-46d0-b789-bed0294cf42b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:37.007 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e45e3898-a6e1-4864-81f3-c61c82d49418]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:37.023 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d1396104-2a4b-49e1-b325-40b5734e0948]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485212, 'reachable_time': 41094, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216642, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:37 np0005544118 systemd[1]: run-netns-ovnmeta\x2d267c5b5d\x2d1150\x2d48df\x2d8bea\x2d7890da55de3f.mount: Deactivated successfully.
Dec  3 09:39:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:37.029 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:39:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:39:37.029 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9d5ed1-18ff-46ef-b2f8-6f757dd62bfd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.373 187287 DEBUG nova.compute.manager [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received event network-vif-unplugged-c819577b-9486-4fec-bb14-cb1c2713cb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.374 187287 DEBUG oslo_concurrency.lockutils [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.374 187287 DEBUG oslo_concurrency.lockutils [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.374 187287 DEBUG oslo_concurrency.lockutils [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.375 187287 DEBUG nova.compute.manager [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] No waiting events found dispatching network-vif-unplugged-c819577b-9486-4fec-bb14-cb1c2713cb60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.375 187287 DEBUG nova.compute.manager [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received event network-vif-unplugged-c819577b-9486-4fec-bb14-cb1c2713cb60 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.375 187287 DEBUG nova.compute.manager [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received event network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.376 187287 DEBUG oslo_concurrency.lockutils [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.376 187287 DEBUG oslo_concurrency.lockutils [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.376 187287 DEBUG oslo_concurrency.lockutils [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.377 187287 DEBUG nova.compute.manager [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] No waiting events found dispatching network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.377 187287 WARNING nova.compute.manager [req-08e76111-05cf-408c-b35f-0af34c3031a0 req-bbe5d603-7652-46e6-a2d2-a609f01d0fb3 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received unexpected event network-vif-plugged-c819577b-9486-4fec-bb14-cb1c2713cb60 for instance with vm_state active and task_state deleting.#033[00m
Dec  3 09:39:37 np0005544118 nova_compute[187283]: 2025-12-03 14:39:37.964 187287 DEBUG nova.network.neutron [-] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.092 187287 INFO nova.compute.manager [-] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Took 2.23 seconds to deallocate network for instance.#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.190 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.191 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.232 187287 DEBUG nova.compute.provider_tree [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.281 187287 DEBUG nova.scheduler.client.report [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.462 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.271s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.535 187287 INFO nova.scheduler.client.report [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance 973264bb-ebfb-4c89-a78d-154cf5c6c4ac#033[00m
Dec  3 09:39:38 np0005544118 nova_compute[187283]: 2025-12-03 14:39:38.759 187287 DEBUG oslo_concurrency.lockutils [None req-2a3b05c8-257f-4991-a4a4-080b2ae1d427 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "973264bb-ebfb-4c89-a78d-154cf5c6c4ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.436s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:39:39 np0005544118 nova_compute[187283]: 2025-12-03 14:39:39.780 187287 DEBUG nova.compute.manager [req-1c61d517-3de9-4f9a-b43c-fb2776192457 req-234ebb96-e241-4c05-97de-44e194190535 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Received event network-vif-deleted-c819577b-9486-4fec-bb14-cb1c2713cb60 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:39:40 np0005544118 nova_compute[187283]: 2025-12-03 14:39:40.126 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:40 np0005544118 nova_compute[187283]: 2025-12-03 14:39:40.808 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:44 np0005544118 podman[216643]: 2025-12-03 14:39:44.848632297 +0000 UTC m=+0.080050956 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, architecture=x86_64, config_id=edpm, vendor=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  3 09:39:45 np0005544118 nova_compute[187283]: 2025-12-03 14:39:45.128 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:45 np0005544118 nova_compute[187283]: 2025-12-03 14:39:45.811 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:45 np0005544118 nova_compute[187283]: 2025-12-03 14:39:45.899 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772770.8991656, bb687bfa-6895-44f9-91e0-0ca8de621adf => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:39:45 np0005544118 nova_compute[187283]: 2025-12-03 14:39:45.900 187287 INFO nova.compute.manager [-] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:39:45 np0005544118 nova_compute[187283]: 2025-12-03 14:39:45.921 187287 DEBUG nova.compute.manager [None req-cd4af7e4-6aad-4199-9b20-f0be367435fd - - - - - -] [instance: bb687bfa-6895-44f9-91e0-0ca8de621adf] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:39:47 np0005544118 podman[216664]: 2025-12-03 14:39:47.829827641 +0000 UTC m=+0.056325871 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:39:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:39:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:39:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:39:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:39:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:39:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:39:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:39:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:39:50 np0005544118 nova_compute[187283]: 2025-12-03 14:39:50.129 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:50 np0005544118 nova_compute[187283]: 2025-12-03 14:39:50.785 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772775.783033, 973264bb-ebfb-4c89-a78d-154cf5c6c4ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:39:50 np0005544118 nova_compute[187283]: 2025-12-03 14:39:50.785 187287 INFO nova.compute.manager [-] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:39:50 np0005544118 nova_compute[187283]: 2025-12-03 14:39:50.813 187287 DEBUG nova.compute.manager [None req-195fadc5-d942-4df7-8153-f3be83b66d75 - - - - - -] [instance: 973264bb-ebfb-4c89-a78d-154cf5c6c4ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:39:50 np0005544118 nova_compute[187283]: 2025-12-03 14:39:50.814 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:55 np0005544118 nova_compute[187283]: 2025-12-03 14:39:55.173 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:55 np0005544118 nova_compute[187283]: 2025-12-03 14:39:55.816 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:39:57 np0005544118 podman[216687]: 2025-12-03 14:39:57.847336533 +0000 UTC m=+0.074380120 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:40:00 np0005544118 nova_compute[187283]: 2025-12-03 14:40:00.175 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:00 np0005544118 podman[216704]: 2025-12-03 14:40:00.276375107 +0000 UTC m=+0.073586067 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:40:00 np0005544118 nova_compute[187283]: 2025-12-03 14:40:00.829 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:00.972 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:00.972 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:00.972 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:05 np0005544118 nova_compute[187283]: 2025-12-03 14:40:05.176 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:05 np0005544118 podman[197639]: time="2025-12-03T14:40:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:40:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:40:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:40:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:40:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec  3 09:40:05 np0005544118 nova_compute[187283]: 2025-12-03 14:40:05.830 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:05 np0005544118 podman[216730]: 2025-12-03 14:40:05.842363983 +0000 UTC m=+0.079514591 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:40:07 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:07Z|00193|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  3 09:40:10 np0005544118 nova_compute[187283]: 2025-12-03 14:40:10.177 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:10 np0005544118 nova_compute[187283]: 2025-12-03 14:40:10.832 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:15 np0005544118 nova_compute[187283]: 2025-12-03 14:40:15.179 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:15 np0005544118 nova_compute[187283]: 2025-12-03 14:40:15.835 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:15 np0005544118 podman[216757]: 2025-12-03 14:40:15.863406856 +0000 UTC m=+0.081902277 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Dec  3 09:40:18 np0005544118 podman[216779]: 2025-12-03 14:40:18.848444734 +0000 UTC m=+0.072210690 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:40:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:40:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:40:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:40:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:40:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:40:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:40:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:40:19 np0005544118 nova_compute[187283]: 2025-12-03 14:40:19.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:20 np0005544118 nova_compute[187283]: 2025-12-03 14:40:20.180 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:20.272 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:40:20 np0005544118 nova_compute[187283]: 2025-12-03 14:40:20.272 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:20.274 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:40:20 np0005544118 nova_compute[187283]: 2025-12-03 14:40:20.837 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:21.276 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:21 np0005544118 nova_compute[187283]: 2025-12-03 14:40:21.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:23 np0005544118 nova_compute[187283]: 2025-12-03 14:40:23.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:25 np0005544118 nova_compute[187283]: 2025-12-03 14:40:25.181 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:25 np0005544118 nova_compute[187283]: 2025-12-03 14:40:25.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:25 np0005544118 nova_compute[187283]: 2025-12-03 14:40:25.840 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:26 np0005544118 nova_compute[187283]: 2025-12-03 14:40:26.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.629 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.629 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.652 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.652 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.653 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.653 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:40:28 np0005544118 podman[216799]: 2025-12-03 14:40:28.747023533 +0000 UTC m=+0.051056786 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.807 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.807 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5876MB free_disk=73.33404922485352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.808 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:28 np0005544118 nova_compute[187283]: 2025-12-03 14:40:28.808 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:29 np0005544118 nova_compute[187283]: 2025-12-03 14:40:29.086 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:40:29 np0005544118 nova_compute[187283]: 2025-12-03 14:40:29.087 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:40:29 np0005544118 nova_compute[187283]: 2025-12-03 14:40:29.109 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:40:29 np0005544118 nova_compute[187283]: 2025-12-03 14:40:29.129 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:40:29 np0005544118 nova_compute[187283]: 2025-12-03 14:40:29.165 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:40:29 np0005544118 nova_compute[187283]: 2025-12-03 14:40:29.165 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:30 np0005544118 nova_compute[187283]: 2025-12-03 14:40:30.160 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:30 np0005544118 nova_compute[187283]: 2025-12-03 14:40:30.183 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:30 np0005544118 podman[216819]: 2025-12-03 14:40:30.820476912 +0000 UTC m=+0.054197623 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:40:30 np0005544118 nova_compute[187283]: 2025-12-03 14:40:30.842 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.047 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.047 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.065 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.145 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.145 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.154 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.154 187287 INFO nova.compute.claims [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.267 187287 DEBUG nova.compute.provider_tree [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.280 187287 DEBUG nova.scheduler.client.report [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.306 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.307 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.387 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.388 187287 DEBUG nova.network.neutron [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.410 187287 INFO nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.432 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.538 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.540 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.541 187287 INFO nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Creating image(s)#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.542 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "/var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.542 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.543 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.556 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.611 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.612 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.613 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.624 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.679 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.681 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.733 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.735 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.736 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.769 187287 DEBUG nova.policy [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '050e065161ed4e4dbf457584926aae78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90982dffa8cd42a2b3280941bc8b991e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.794 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.795 187287 DEBUG nova.virt.disk.api [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Checking if we can resize image /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.796 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.854 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.856 187287 DEBUG nova.virt.disk.api [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Cannot resize image /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.857 187287 DEBUG nova.objects.instance [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'migration_context' on Instance uuid fec359bd-cda3-4345-b6c9-06687237a914 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.893 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.894 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Ensure instance console log exists: /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.894 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.895 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:32 np0005544118 nova_compute[187283]: 2025-12-03 14:40:32.895 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:33 np0005544118 nova_compute[187283]: 2025-12-03 14:40:33.385 187287 DEBUG nova.network.neutron [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Successfully created port: 9630a169-bea7-4d47-84ca-f5c62a91d2e2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.248 187287 DEBUG nova.network.neutron [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Successfully updated port: 9630a169-bea7-4d47-84ca-f5c62a91d2e2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.273 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.273 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquired lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.273 187287 DEBUG nova.network.neutron [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.353 187287 DEBUG nova.compute.manager [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-changed-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.353 187287 DEBUG nova.compute.manager [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Refreshing instance network info cache due to event network-changed-9630a169-bea7-4d47-84ca-f5c62a91d2e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.354 187287 DEBUG oslo_concurrency.lockutils [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:40:34 np0005544118 nova_compute[187283]: 2025-12-03 14:40:34.645 187287 DEBUG nova.network.neutron [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:40:35 np0005544118 nova_compute[187283]: 2025-12-03 14:40:35.185 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:35 np0005544118 nova_compute[187283]: 2025-12-03 14:40:35.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:40:35 np0005544118 nova_compute[187283]: 2025-12-03 14:40:35.610 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:40:35 np0005544118 podman[197639]: time="2025-12-03T14:40:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:40:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:40:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:40:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:40:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec  3 09:40:35 np0005544118 nova_compute[187283]: 2025-12-03 14:40:35.844 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:36 np0005544118 podman[216858]: 2025-12-03 14:40:36.879420746 +0000 UTC m=+0.111746589 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.746 187287 DEBUG nova.network.neutron [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Updating instance_info_cache with network_info: [{"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.775 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Releasing lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.775 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Instance network_info: |[{"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.776 187287 DEBUG oslo_concurrency.lockutils [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.776 187287 DEBUG nova.network.neutron [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Refreshing network info cache for port 9630a169-bea7-4d47-84ca-f5c62a91d2e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.779 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Start _get_guest_xml network_info=[{"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.783 187287 WARNING nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.789 187287 DEBUG nova.virt.libvirt.host [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.790 187287 DEBUG nova.virt.libvirt.host [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.793 187287 DEBUG nova.virt.libvirt.host [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.794 187287 DEBUG nova.virt.libvirt.host [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.795 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.795 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.796 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.796 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.796 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.797 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.797 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.797 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.798 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.798 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.798 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.798 187287 DEBUG nova.virt.hardware [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.802 187287 DEBUG nova.virt.libvirt.vif [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:40:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-722175843',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-722175843',id=22,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-cohsgmu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:40:32Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=fec359bd-cda3-4345-b6c9-06687237a914,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.803 187287 DEBUG nova.network.os_vif_util [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.804 187287 DEBUG nova.network.os_vif_util [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.805 187287 DEBUG nova.objects.instance [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'pci_devices' on Instance uuid fec359bd-cda3-4345-b6c9-06687237a914 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.834 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <uuid>fec359bd-cda3-4345-b6c9-06687237a914</uuid>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <name>instance-00000016</name>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteStrategies-server-722175843</nova:name>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:40:37</nova:creationTime>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:user uuid="050e065161ed4e4dbf457584926aae78">tempest-TestExecuteStrategies-270472559-project-member</nova:user>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:project uuid="90982dffa8cd42a2b3280941bc8b991e">tempest-TestExecuteStrategies-270472559</nova:project>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        <nova:port uuid="9630a169-bea7-4d47-84ca-f5c62a91d2e2">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <entry name="serial">fec359bd-cda3-4345-b6c9-06687237a914</entry>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <entry name="uuid">fec359bd-cda3-4345-b6c9-06687237a914</entry>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk.config"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:dd:05:d1"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <target dev="tap9630a169-be"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/console.log" append="off"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:40:37 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:40:37 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:40:37 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:40:37 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.835 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Preparing to wait for external event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.835 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.835 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.836 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.836 187287 DEBUG nova.virt.libvirt.vif [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:40:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-722175843',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-722175843',id=22,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-cohsgmu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:40:32Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=fec359bd-cda3-4345-b6c9-06687237a914,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.837 187287 DEBUG nova.network.os_vif_util [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.837 187287 DEBUG nova.network.os_vif_util [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.837 187287 DEBUG os_vif [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.838 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.838 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.839 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.842 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.843 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9630a169-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.843 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9630a169-be, col_values=(('external_ids', {'iface-id': '9630a169-bea7-4d47-84ca-f5c62a91d2e2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:05:d1', 'vm-uuid': 'fec359bd-cda3-4345-b6c9-06687237a914'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.892 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:37 np0005544118 NetworkManager[55710]: <info>  [1764772837.8934] manager: (tap9630a169-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.895 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.900 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.902 187287 INFO os_vif [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be')#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.946 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.947 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.947 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No VIF found with MAC fa:16:3e:dd:05:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:40:37 np0005544118 nova_compute[187283]: 2025-12-03 14:40:37.947 187287 INFO nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Using config drive#033[00m
Dec  3 09:40:38 np0005544118 nova_compute[187283]: 2025-12-03 14:40:38.772 187287 INFO nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Creating config drive at /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk.config#033[00m
Dec  3 09:40:38 np0005544118 nova_compute[187283]: 2025-12-03 14:40:38.778 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcswf2euj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:38 np0005544118 nova_compute[187283]: 2025-12-03 14:40:38.918 187287 DEBUG oslo_concurrency.processutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcswf2euj" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:38 np0005544118 kernel: tap9630a169-be: entered promiscuous mode
Dec  3 09:40:39 np0005544118 NetworkManager[55710]: <info>  [1764772838.9768] manager: (tap9630a169-be): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Dec  3 09:40:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:39Z|00194|binding|INFO|Claiming lport 9630a169-bea7-4d47-84ca-f5c62a91d2e2 for this chassis.
Dec  3 09:40:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:39Z|00195|binding|INFO|9630a169-bea7-4d47-84ca-f5c62a91d2e2: Claiming fa:16:3e:dd:05:d1 10.100.0.5
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.021 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.036 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:05:d1 10.100.0.5'], port_security=['fa:16:3e:dd:05:d1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'fec359bd-cda3-4345-b6c9-06687237a914', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=9630a169-bea7-4d47-84ca-f5c62a91d2e2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:40:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:39Z|00196|binding|INFO|Setting lport 9630a169-bea7-4d47-84ca-f5c62a91d2e2 ovn-installed in OVS
Dec  3 09:40:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:39Z|00197|binding|INFO|Setting lport 9630a169-bea7-4d47-84ca-f5c62a91d2e2 up in Southbound
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.038 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 9630a169-bea7-4d47-84ca-f5c62a91d2e2 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.038 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.039 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:40:39 np0005544118 systemd-udevd[216904]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.051 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c96d19a0-7bad-48bb-9cf1-43742e6b2f77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.053 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267c5b5d-11 in ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.054 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267c5b5d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.054 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ca764d0b-fc77-4aff-a903-3186c355c24f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.055 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f6e980b5-4229-454e-920f-ed799baa4b4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 systemd-machined[153602]: New machine qemu-18-instance-00000016.
Dec  3 09:40:39 np0005544118 NetworkManager[55710]: <info>  [1764772839.0676] device (tap9630a169-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:40:39 np0005544118 NetworkManager[55710]: <info>  [1764772839.0690] device (tap9630a169-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.069 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[23068b37-169e-4e04-91df-76f3902935e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 systemd[1]: Started Virtual Machine qemu-18-instance-00000016.
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.095 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[370982b7-e251-49f6-a228-90414fb1a945]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.123 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[bbafa7b1-bc11-4161-ad53-9be4f2ca4334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.128 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1731071a-265f-4d2e-8aab-a3530985a113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 NetworkManager[55710]: <info>  [1764772839.1298] manager: (tap267c5b5d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.157 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[a28826ea-4ca4-4104-8cc9-038c2147def5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.159 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0cd3a6-b7c2-4274-81c4-975c1aa1029e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 NetworkManager[55710]: <info>  [1764772839.1792] device (tap267c5b5d-10): carrier: link connected
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.184 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[62db4a9d-0a8c-42c5-8cf1-58dcdb7652d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.200 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[90ce3684-4ac9-465f-852d-29d42bade448]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499478, 'reachable_time': 43520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216937, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.213 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ef071e4d-b99e-4ea1-8543-5635afb49cd9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:71b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 499478, 'tstamp': 499478}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216938, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.226 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eee069d0-5101-465f-a8b4-356a31acd82d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499478, 'reachable_time': 43520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216939, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.255 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[873c8a47-9b66-468c-8feb-05ac1ab58d82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.318 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3ec4eaaf-90e9-4d14-823a-21d0c8bbc63a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.320 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.320 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.320 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:39 np0005544118 NetworkManager[55710]: <info>  [1764772839.3226] manager: (tap267c5b5d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Dec  3 09:40:39 np0005544118 kernel: tap267c5b5d-10: entered promiscuous mode
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.323 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.325 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:39 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:39Z|00198|binding|INFO|Releasing lport f97a1eb5-3e39-4381-b113-959844eaa3b1 from this chassis (sb_readonly=0)
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.347 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.348 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.349 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[55a8a6a0-903c-41f0-b855-c61ab0e21879]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.350 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:40:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:39.350 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'env', 'PROCESS_TAG=haproxy-267c5b5d-1150-48df-8bea-7890da55de3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267c5b5d-1150-48df-8bea-7890da55de3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.542 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772839.5418146, fec359bd-cda3-4345-b6c9-06687237a914 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.543 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] VM Started (Lifecycle Event)#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.568 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.573 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772839.5421124, fec359bd-cda3-4345-b6c9-06687237a914 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.573 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.593 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.597 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.613 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:40:39 np0005544118 podman[216977]: 2025-12-03 14:40:39.785588471 +0000 UTC m=+0.056996550 container create 7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:40:39 np0005544118 systemd[1]: Started libpod-conmon-7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c.scope.
Dec  3 09:40:39 np0005544118 podman[216977]: 2025-12-03 14:40:39.74960869 +0000 UTC m=+0.021016799 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:40:39 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:40:39 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83c95b7099b5d7d7d758edd13eac2cc118050d2f0ddc85afc9910e2af4cc3153/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.884 187287 DEBUG nova.compute.manager [req-aef42353-0516-4453-bbe8-2230a569d5e7 req-84c7a684-7056-46c4-b387-2ab07fe8b416 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.885 187287 DEBUG oslo_concurrency.lockutils [req-aef42353-0516-4453-bbe8-2230a569d5e7 req-84c7a684-7056-46c4-b387-2ab07fe8b416 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.885 187287 DEBUG oslo_concurrency.lockutils [req-aef42353-0516-4453-bbe8-2230a569d5e7 req-84c7a684-7056-46c4-b387-2ab07fe8b416 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.885 187287 DEBUG oslo_concurrency.lockutils [req-aef42353-0516-4453-bbe8-2230a569d5e7 req-84c7a684-7056-46c4-b387-2ab07fe8b416 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.885 187287 DEBUG nova.compute.manager [req-aef42353-0516-4453-bbe8-2230a569d5e7 req-84c7a684-7056-46c4-b387-2ab07fe8b416 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Processing event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.886 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:40:39 np0005544118 podman[216977]: 2025-12-03 14:40:39.887807166 +0000 UTC m=+0.159215275 container init 7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.891 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772839.890964, fec359bd-cda3-4345-b6c9-06687237a914 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.891 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.892 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:40:39 np0005544118 podman[216977]: 2025-12-03 14:40:39.895211331 +0000 UTC m=+0.166619430 container start 7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.896 187287 INFO nova.virt.libvirt.driver [-] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Instance spawned successfully.#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.897 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.914 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.920 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.923 187287 DEBUG nova.network.neutron [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Updated VIF entry in instance network info cache for port 9630a169-bea7-4d47-84ca-f5c62a91d2e2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.923 187287 DEBUG nova.network.neutron [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Updating instance_info_cache with network_info: [{"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.926 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.926 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.926 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:40:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216992]: [NOTICE]   (216996) : New worker (216998) forked
Dec  3 09:40:39 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216992]: [NOTICE]   (216996) : Loading success.
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.927 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.927 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.927 187287 DEBUG nova.virt.libvirt.driver [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.957 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.960 187287 DEBUG oslo_concurrency.lockutils [req-db1f28e9-197a-4b8f-8a00-9d3e38a74b99 req-821c103a-e6ba-4808-be16-d92f22a48ef6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.993 187287 INFO nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Took 7.45 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:40:39 np0005544118 nova_compute[187283]: 2025-12-03 14:40:39.993 187287 DEBUG nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:40:40 np0005544118 nova_compute[187283]: 2025-12-03 14:40:40.059 187287 INFO nova.compute.manager [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Took 7.94 seconds to build instance.#033[00m
Dec  3 09:40:40 np0005544118 nova_compute[187283]: 2025-12-03 14:40:40.076 187287 DEBUG oslo_concurrency.lockutils [None req-393618fe-cde3-4acc-bab3-ffa89180d8e7 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:40 np0005544118 nova_compute[187283]: 2025-12-03 14:40:40.187 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:41 np0005544118 nova_compute[187283]: 2025-12-03 14:40:41.992 187287 DEBUG nova.compute.manager [req-f06eccab-895a-47fa-9a08-81ac6293dec4 req-9662e00f-7d7f-44d2-80a6-1d1d6f9a5a66 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:41 np0005544118 nova_compute[187283]: 2025-12-03 14:40:41.993 187287 DEBUG oslo_concurrency.lockutils [req-f06eccab-895a-47fa-9a08-81ac6293dec4 req-9662e00f-7d7f-44d2-80a6-1d1d6f9a5a66 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:41 np0005544118 nova_compute[187283]: 2025-12-03 14:40:41.993 187287 DEBUG oslo_concurrency.lockutils [req-f06eccab-895a-47fa-9a08-81ac6293dec4 req-9662e00f-7d7f-44d2-80a6-1d1d6f9a5a66 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:41 np0005544118 nova_compute[187283]: 2025-12-03 14:40:41.994 187287 DEBUG oslo_concurrency.lockutils [req-f06eccab-895a-47fa-9a08-81ac6293dec4 req-9662e00f-7d7f-44d2-80a6-1d1d6f9a5a66 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:41 np0005544118 nova_compute[187283]: 2025-12-03 14:40:41.994 187287 DEBUG nova.compute.manager [req-f06eccab-895a-47fa-9a08-81ac6293dec4 req-9662e00f-7d7f-44d2-80a6-1d1d6f9a5a66 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:41 np0005544118 nova_compute[187283]: 2025-12-03 14:40:41.994 187287 WARNING nova.compute.manager [req-f06eccab-895a-47fa-9a08-81ac6293dec4 req-9662e00f-7d7f-44d2-80a6-1d1d6f9a5a66 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received unexpected event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:40:42 np0005544118 nova_compute[187283]: 2025-12-03 14:40:42.893 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:45 np0005544118 nova_compute[187283]: 2025-12-03 14:40:45.189 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:46 np0005544118 podman[217008]: 2025-12-03 14:40:46.84564164 +0000 UTC m=+0.064934148 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, distribution-scope=public)
Dec  3 09:40:47 np0005544118 nova_compute[187283]: 2025-12-03 14:40:47.201 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Check if temp file /var/lib/nova/instances/tmpbn904lh1 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Dec  3 09:40:47 np0005544118 nova_compute[187283]: 2025-12-03 14:40:47.201 187287 DEBUG nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpbn904lh1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fec359bd-cda3-4345-b6c9-06687237a914',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Dec  3 09:40:47 np0005544118 nova_compute[187283]: 2025-12-03 14:40:47.633 187287 DEBUG oslo_concurrency.processutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:47 np0005544118 nova_compute[187283]: 2025-12-03 14:40:47.701 187287 DEBUG oslo_concurrency.processutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:47 np0005544118 nova_compute[187283]: 2025-12-03 14:40:47.702 187287 DEBUG oslo_concurrency.processutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:40:47 np0005544118 nova_compute[187283]: 2025-12-03 14:40:47.759 187287 DEBUG oslo_concurrency.processutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:40:47 np0005544118 nova_compute[187283]: 2025-12-03 14:40:47.896 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:40:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:40:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:40:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:40:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:40:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:40:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:40:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:40:49 np0005544118 podman[217033]: 2025-12-03 14:40:49.820958482 +0000 UTC m=+0.052854048 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:40:50 np0005544118 nova_compute[187283]: 2025-12-03 14:40:50.191 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:51 np0005544118 systemd-logind[795]: New session 32 of user nova.
Dec  3 09:40:51 np0005544118 systemd[1]: Created slice User Slice of UID 42436.
Dec  3 09:40:51 np0005544118 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec  3 09:40:51 np0005544118 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec  3 09:40:51 np0005544118 systemd[1]: Starting User Manager for UID 42436...
Dec  3 09:40:51 np0005544118 systemd[217063]: Queued start job for default target Main User Target.
Dec  3 09:40:51 np0005544118 systemd[217063]: Created slice User Application Slice.
Dec  3 09:40:51 np0005544118 systemd[217063]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:40:51 np0005544118 systemd[217063]: Started Daily Cleanup of User's Temporary Directories.
Dec  3 09:40:51 np0005544118 systemd[217063]: Reached target Paths.
Dec  3 09:40:51 np0005544118 systemd[217063]: Reached target Timers.
Dec  3 09:40:51 np0005544118 systemd[217063]: Starting D-Bus User Message Bus Socket...
Dec  3 09:40:51 np0005544118 systemd[217063]: Starting Create User's Volatile Files and Directories...
Dec  3 09:40:51 np0005544118 systemd[217063]: Listening on D-Bus User Message Bus Socket.
Dec  3 09:40:51 np0005544118 systemd[217063]: Finished Create User's Volatile Files and Directories.
Dec  3 09:40:51 np0005544118 systemd[217063]: Reached target Sockets.
Dec  3 09:40:51 np0005544118 systemd[217063]: Reached target Basic System.
Dec  3 09:40:51 np0005544118 systemd[217063]: Reached target Main User Target.
Dec  3 09:40:51 np0005544118 systemd[217063]: Startup finished in 153ms.
Dec  3 09:40:51 np0005544118 systemd[1]: Started User Manager for UID 42436.
Dec  3 09:40:51 np0005544118 systemd[1]: Started Session 32 of User nova.
Dec  3 09:40:51 np0005544118 systemd[1]: session-32.scope: Deactivated successfully.
Dec  3 09:40:51 np0005544118 systemd-logind[795]: Session 32 logged out. Waiting for processes to exit.
Dec  3 09:40:51 np0005544118 systemd-logind[795]: Removed session 32.
Dec  3 09:40:52 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:52Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:05:d1 10.100.0.5
Dec  3 09:40:52 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:52Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:05:d1 10.100.0.5
Dec  3 09:40:52 np0005544118 nova_compute[187283]: 2025-12-03 14:40:52.900 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:53 np0005544118 nova_compute[187283]: 2025-12-03 14:40:53.174 187287 DEBUG nova.compute.manager [req-c25fa8fb-ea81-4916-8bff-aa90fb824480 req-2e5e8472-b3da-4cfe-8432-ef41b87c8654 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:53 np0005544118 nova_compute[187283]: 2025-12-03 14:40:53.174 187287 DEBUG oslo_concurrency.lockutils [req-c25fa8fb-ea81-4916-8bff-aa90fb824480 req-2e5e8472-b3da-4cfe-8432-ef41b87c8654 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:53 np0005544118 nova_compute[187283]: 2025-12-03 14:40:53.175 187287 DEBUG oslo_concurrency.lockutils [req-c25fa8fb-ea81-4916-8bff-aa90fb824480 req-2e5e8472-b3da-4cfe-8432-ef41b87c8654 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:53 np0005544118 nova_compute[187283]: 2025-12-03 14:40:53.175 187287 DEBUG oslo_concurrency.lockutils [req-c25fa8fb-ea81-4916-8bff-aa90fb824480 req-2e5e8472-b3da-4cfe-8432-ef41b87c8654 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:53 np0005544118 nova_compute[187283]: 2025-12-03 14:40:53.175 187287 DEBUG nova.compute.manager [req-c25fa8fb-ea81-4916-8bff-aa90fb824480 req-2e5e8472-b3da-4cfe-8432-ef41b87c8654 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:53 np0005544118 nova_compute[187283]: 2025-12-03 14:40:53.175 187287 DEBUG nova.compute.manager [req-c25fa8fb-ea81-4916-8bff-aa90fb824480 req-2e5e8472-b3da-4cfe-8432-ef41b87c8654 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.825 187287 INFO nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Took 7.06 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.826 187287 DEBUG nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.845 187287 DEBUG nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpbn904lh1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='fec359bd-cda3-4345-b6c9-06687237a914',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(cf00b9b6-5eec-4e43-bf10-b5e35ce9237b),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.868 187287 DEBUG nova.objects.instance [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lazy-loading 'migration_context' on Instance uuid fec359bd-cda3-4345-b6c9-06687237a914 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.869 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.871 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.871 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.885 187287 DEBUG nova.virt.libvirt.vif [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:40:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-722175843',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-722175843',id=22,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:40:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-cohsgmu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:40:40Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=fec359bd-cda3-4345-b6c9-06687237a914,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.886 187287 DEBUG nova.network.os_vif_util [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converting VIF {"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.886 187287 DEBUG nova.network.os_vif_util [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.887 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Updating guest XML with vif config: <interface type="ethernet">
Dec  3 09:40:54 np0005544118 nova_compute[187283]:  <mac address="fa:16:3e:dd:05:d1"/>
Dec  3 09:40:54 np0005544118 nova_compute[187283]:  <model type="virtio"/>
Dec  3 09:40:54 np0005544118 nova_compute[187283]:  <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:40:54 np0005544118 nova_compute[187283]:  <mtu size="1442"/>
Dec  3 09:40:54 np0005544118 nova_compute[187283]:  <target dev="tap9630a169-be"/>
Dec  3 09:40:54 np0005544118 nova_compute[187283]: </interface>
Dec  3 09:40:54 np0005544118 nova_compute[187283]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Dec  3 09:40:54 np0005544118 nova_compute[187283]: 2025-12-03 14:40:54.888 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.192 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.249 187287 DEBUG nova.compute.manager [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.250 187287 DEBUG oslo_concurrency.lockutils [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.250 187287 DEBUG oslo_concurrency.lockutils [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.250 187287 DEBUG oslo_concurrency.lockutils [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.251 187287 DEBUG nova.compute.manager [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.251 187287 WARNING nova.compute.manager [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received unexpected event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.251 187287 DEBUG nova.compute.manager [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-changed-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.251 187287 DEBUG nova.compute.manager [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Refreshing instance network info cache due to event network-changed-9630a169-bea7-4d47-84ca-f5c62a91d2e2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.251 187287 DEBUG oslo_concurrency.lockutils [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.252 187287 DEBUG oslo_concurrency.lockutils [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.252 187287 DEBUG nova.network.neutron [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Refreshing network info cache for port 9630a169-bea7-4d47-84ca-f5c62a91d2e2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.374 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.374 187287 INFO nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.447 187287 INFO nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.950 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:40:55 np0005544118 nova_compute[187283]: 2025-12-03 14:40:55.951 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.455 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.456 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.940 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772856.9396136, fec359bd-cda3-4345-b6c9-06687237a914 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.941 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.949 187287 DEBUG nova.network.neutron [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Updated VIF entry in instance network info cache for port 9630a169-bea7-4d47-84ca-f5c62a91d2e2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.949 187287 DEBUG nova.network.neutron [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Updating instance_info_cache with network_info: [{"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.966 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.973 187287 DEBUG oslo_concurrency.lockutils [req-2c5d77c5-4c13-492f-b2e0-363189b407b2 req-ad8805ef-cae6-4bcd-879f-b572c1eb28f6 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-fec359bd-cda3-4345-b6c9-06687237a914" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.975 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.983 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.984 187287 DEBUG nova.virt.libvirt.migration [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:40:56 np0005544118 nova_compute[187283]: 2025-12-03 14:40:56.998 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Dec  3 09:40:57 np0005544118 kernel: tap9630a169-be (unregistering): left promiscuous mode
Dec  3 09:40:57 np0005544118 NetworkManager[55710]: <info>  [1764772857.1180] device (tap9630a169-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:40:57 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:57Z|00199|binding|INFO|Releasing lport 9630a169-bea7-4d47-84ca-f5c62a91d2e2 from this chassis (sb_readonly=0)
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.136 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:57 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:57Z|00200|binding|INFO|Setting lport 9630a169-bea7-4d47-84ca-f5c62a91d2e2 down in Southbound
Dec  3 09:40:57 np0005544118 ovn_controller[95637]: 2025-12-03T14:40:57Z|00201|binding|INFO|Removing iface tap9630a169-be ovn-installed in OVS
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.139 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.146 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:05:d1 10.100.0.5'], port_security=['fa:16:3e:dd:05:d1 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3a9d7e7b-04f9-4aed-a199-9003ff5fe58c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'fec359bd-cda3-4345-b6c9-06687237a914', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=9630a169-bea7-4d47-84ca-f5c62a91d2e2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.148 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 9630a169-bea7-4d47-84ca-f5c62a91d2e2 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.150 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267c5b5d-1150-48df-8bea-7890da55de3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.152 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9f844ffd-b351-4507-b803-f8643862cd3d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.154 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace which is not needed anymore#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.155 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:57 np0005544118 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec  3 09:40:57 np0005544118 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Consumed 14.336s CPU time.
Dec  3 09:40:57 np0005544118 systemd-machined[153602]: Machine qemu-18-instance-00000016 terminated.
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.322 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.328 187287 DEBUG nova.compute.manager [req-0568854d-c55c-409f-9b21-69414d1c6444 req-3ecb2845-8b88-4717-a3ba-0b33706158cd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.329 187287 DEBUG oslo_concurrency.lockutils [req-0568854d-c55c-409f-9b21-69414d1c6444 req-3ecb2845-8b88-4717-a3ba-0b33706158cd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.329 187287 DEBUG oslo_concurrency.lockutils [req-0568854d-c55c-409f-9b21-69414d1c6444 req-3ecb2845-8b88-4717-a3ba-0b33706158cd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.329 187287 DEBUG oslo_concurrency.lockutils [req-0568854d-c55c-409f-9b21-69414d1c6444 req-3ecb2845-8b88-4717-a3ba-0b33706158cd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.329 187287 DEBUG nova.compute.manager [req-0568854d-c55c-409f-9b21-69414d1c6444 req-3ecb2845-8b88-4717-a3ba-0b33706158cd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.330 187287 DEBUG nova.compute.manager [req-0568854d-c55c-409f-9b21-69414d1c6444 req-3ecb2845-8b88-4717-a3ba-0b33706158cd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.330 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.367 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.368 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.369 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.486 187287 DEBUG nova.virt.libvirt.guest [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid 'fec359bd-cda3-4345-b6c9-06687237a914' (instance-00000016) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.487 187287 INFO nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Migration operation has completed#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.487 187287 INFO nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] _post_live_migration() is started..#033[00m
Dec  3 09:40:57 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216992]: [NOTICE]   (216996) : haproxy version is 2.8.14-c23fe91
Dec  3 09:40:57 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216992]: [NOTICE]   (216996) : path to executable is /usr/sbin/haproxy
Dec  3 09:40:57 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216992]: [WARNING]  (216996) : Exiting Master process...
Dec  3 09:40:57 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216992]: [ALERT]    (216996) : Current worker (216998) exited with code 143 (Terminated)
Dec  3 09:40:57 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[216992]: [WARNING]  (216996) : All workers exited. Exiting... (0)
Dec  3 09:40:57 np0005544118 systemd[1]: libpod-7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c.scope: Deactivated successfully.
Dec  3 09:40:57 np0005544118 podman[217114]: 2025-12-03 14:40:57.720255168 +0000 UTC m=+0.462247545 container died 7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  3 09:40:57 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c-userdata-shm.mount: Deactivated successfully.
Dec  3 09:40:57 np0005544118 systemd[1]: var-lib-containers-storage-overlay-83c95b7099b5d7d7d758edd13eac2cc118050d2f0ddc85afc9910e2af4cc3153-merged.mount: Deactivated successfully.
Dec  3 09:40:57 np0005544118 podman[217114]: 2025-12-03 14:40:57.759913987 +0000 UTC m=+0.501906384 container cleanup 7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:40:57 np0005544118 systemd[1]: libpod-conmon-7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c.scope: Deactivated successfully.
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.902 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:57 np0005544118 podman[217162]: 2025-12-03 14:40:57.974283186 +0000 UTC m=+0.192579909 container remove 7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.979 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c2eb5ab3-572c-460c-a5cd-58b1fe017766]: (4, ('Wed Dec  3 02:40:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c)\n7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c\nWed Dec  3 02:40:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c)\n7e13081b7285a8cd21a0cb22c283a363b5a95fbd089a4257b05ba9a8b9d1377c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.980 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[71552676-aee4-4ac4-9411-f15eb2d6af68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:57 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:57.981 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.983 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:57 np0005544118 kernel: tap267c5b5d-10: left promiscuous mode
Dec  3 09:40:57 np0005544118 nova_compute[187283]: 2025-12-03 14:40:57.998 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:58.002 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[34759863-004c-4985-b980-f71471a476ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:58.025 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8f3cbf42-90ec-48b0-ab39-dd4c610ca4f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:58.026 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f781e670-ac51-4805-a40f-04aa2b454023]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:58.041 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[068c54df-fed6-43b9-a1c8-dfd05ba64b97]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 499472, 'reachable_time': 28021, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217180, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:58.044 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:40:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:40:58.044 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[9459df19-97ba-47a4-9c97-5c35b4946c54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:40:58 np0005544118 systemd[1]: run-netns-ovnmeta\x2d267c5b5d\x2d1150\x2d48df\x2d8bea\x2d7890da55de3f.mount: Deactivated successfully.
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.366 187287 DEBUG nova.network.neutron [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Activated binding for port 9630a169-bea7-4d47-84ca-f5c62a91d2e2 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.366 187287 DEBUG nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.367 187287 DEBUG nova.virt.libvirt.vif [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:40:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-722175843',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-722175843',id=22,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:40:39Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-cohsgmu2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:40:45Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=fec359bd-cda3-4345-b6c9-06687237a914,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.368 187287 DEBUG nova.network.os_vif_util [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converting VIF {"id": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "address": "fa:16:3e:dd:05:d1", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9630a169-be", "ovs_interfaceid": "9630a169-bea7-4d47-84ca-f5c62a91d2e2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.368 187287 DEBUG nova.network.os_vif_util [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.369 187287 DEBUG os_vif [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.370 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.371 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9630a169-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.372 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.373 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.377 187287 INFO os_vif [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:05:d1,bridge_name='br-int',has_traffic_filtering=True,id=9630a169-bea7-4d47-84ca-f5c62a91d2e2,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9630a169-be')#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.378 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.378 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.378 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.379 187287 DEBUG nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.379 187287 INFO nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Deleting instance files /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914_del#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.380 187287 INFO nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Deletion of /var/lib/nova/instances/fec359bd-cda3-4345-b6c9-06687237a914_del complete#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.419 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.419 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.420 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.420 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.420 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.420 187287 WARNING nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received unexpected event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.420 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.421 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.421 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.421 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.421 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.422 187287 WARNING nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received unexpected event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.422 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.422 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.422 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.422 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.423 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.423 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-unplugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.423 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.423 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.423 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.424 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.424 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.424 187287 WARNING nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received unexpected event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.424 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.425 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.425 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.425 187287 DEBUG oslo_concurrency.lockutils [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.425 187287 DEBUG nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] No waiting events found dispatching network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:40:59 np0005544118 nova_compute[187283]: 2025-12-03 14:40:59.425 187287 WARNING nova.compute.manager [req-86556256-376a-4299-a301-a772cbd67963 req-954ef9d0-efdb-4bf1-85da-d69a7fc182b5 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Received unexpected event network-vif-plugged-9630a169-bea7-4d47-84ca-f5c62a91d2e2 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:40:59 np0005544118 podman[217181]: 2025-12-03 14:40:59.815204696 +0000 UTC m=+0.047890524 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  3 09:41:00 np0005544118 nova_compute[187283]: 2025-12-03 14:41:00.195 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:41:00.973 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:41:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:41:00.974 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:41:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:41:00.974 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:41:01 np0005544118 podman[217201]: 2025-12-03 14:41:01.811627894 +0000 UTC m=+0.045664912 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:41:01 np0005544118 systemd[1]: Stopping User Manager for UID 42436...
Dec  3 09:41:01 np0005544118 systemd[217063]: Activating special unit Exit the Session...
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped target Main User Target.
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped target Basic System.
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped target Paths.
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped target Sockets.
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped target Timers.
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  3 09:41:01 np0005544118 systemd[217063]: Closed D-Bus User Message Bus Socket.
Dec  3 09:41:01 np0005544118 systemd[217063]: Stopped Create User's Volatile Files and Directories.
Dec  3 09:41:01 np0005544118 systemd[217063]: Removed slice User Application Slice.
Dec  3 09:41:01 np0005544118 systemd[217063]: Reached target Shutdown.
Dec  3 09:41:01 np0005544118 systemd[217063]: Finished Exit the Session.
Dec  3 09:41:01 np0005544118 systemd[217063]: Reached target Exit the Session.
Dec  3 09:41:02 np0005544118 systemd[1]: user@42436.service: Deactivated successfully.
Dec  3 09:41:02 np0005544118 systemd[1]: Stopped User Manager for UID 42436.
Dec  3 09:41:02 np0005544118 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec  3 09:41:02 np0005544118 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec  3 09:41:02 np0005544118 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec  3 09:41:02 np0005544118 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec  3 09:41:02 np0005544118 systemd[1]: Removed slice User Slice of UID 42436.
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.765 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "fec359bd-cda3-4345-b6c9-06687237a914-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.765 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.765 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "fec359bd-cda3-4345-b6c9-06687237a914-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.788 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.788 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.788 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.789 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.936 187287 WARNING nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.937 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5855MB free_disk=73.33398056030273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.937 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.938 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:41:03 np0005544118 nova_compute[187283]: 2025-12-03 14:41:03.969 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Migration for instance fec359bd-cda3-4345-b6c9-06687237a914 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.206 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.235 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Migration cf00b9b6-5eec-4e43-bf10-b5e35ce9237b is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.236 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.236 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.281 187287 DEBUG nova.compute.provider_tree [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.295 187287 DEBUG nova.scheduler.client.report [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.318 187287 DEBUG nova.compute.resource_tracker [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.318 187287 DEBUG oslo_concurrency.lockutils [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.323 187287 INFO nova.compute.manager [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.372 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.423 187287 INFO nova.scheduler.client.report [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Deleted allocation for migration cf00b9b6-5eec-4e43-bf10-b5e35ce9237b#033[00m
Dec  3 09:41:04 np0005544118 nova_compute[187283]: 2025-12-03 14:41:04.424 187287 DEBUG nova.virt.libvirt.driver [None req-d0841ddf-8ead-4a55-8d68-07e89b60652a bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Dec  3 09:41:05 np0005544118 nova_compute[187283]: 2025-12-03 14:41:05.197 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:05 np0005544118 podman[197639]: time="2025-12-03T14:41:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:41:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:41:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:41:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:41:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec  3 09:41:07 np0005544118 podman[217229]: 2025-12-03 14:41:07.842965373 +0000 UTC m=+0.075212917 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  3 09:41:09 np0005544118 ovn_controller[95637]: 2025-12-03T14:41:09Z|00202|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec  3 09:41:09 np0005544118 nova_compute[187283]: 2025-12-03 14:41:09.373 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:10 np0005544118 nova_compute[187283]: 2025-12-03 14:41:10.198 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:12 np0005544118 nova_compute[187283]: 2025-12-03 14:41:12.365 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764772857.3647723, fec359bd-cda3-4345-b6c9-06687237a914 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:41:12 np0005544118 nova_compute[187283]: 2025-12-03 14:41:12.366 187287 INFO nova.compute.manager [-] [instance: fec359bd-cda3-4345-b6c9-06687237a914] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:41:12 np0005544118 nova_compute[187283]: 2025-12-03 14:41:12.417 187287 DEBUG nova.compute.manager [None req-ecab2b82-1f3b-47d0-a7cd-ec3daa50a20e - - - - - -] [instance: fec359bd-cda3-4345-b6c9-06687237a914] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:41:14 np0005544118 nova_compute[187283]: 2025-12-03 14:41:14.374 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:15 np0005544118 nova_compute[187283]: 2025-12-03 14:41:15.200 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:16 np0005544118 nova_compute[187283]: 2025-12-03 14:41:16.781 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:17 np0005544118 podman[217257]: 2025-12-03 14:41:17.826561065 +0000 UTC m=+0.058426830 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  3 09:41:19 np0005544118 nova_compute[187283]: 2025-12-03 14:41:19.376 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:41:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:41:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:41:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:41:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:41:19 np0005544118 nova_compute[187283]: 2025-12-03 14:41:19.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:19 np0005544118 nova_compute[187283]: 2025-12-03 14:41:19.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 09:41:20 np0005544118 nova_compute[187283]: 2025-12-03 14:41:20.202 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:20 np0005544118 podman[217278]: 2025-12-03 14:41:20.855567257 +0000 UTC m=+0.087346856 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  3 09:41:21 np0005544118 nova_compute[187283]: 2025-12-03 14:41:21.636 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:21 np0005544118 nova_compute[187283]: 2025-12-03 14:41:21.637 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:21 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:41:24 np0005544118 nova_compute[187283]: 2025-12-03 14:41:24.377 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:24 np0005544118 nova_compute[187283]: 2025-12-03 14:41:24.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:25 np0005544118 nova_compute[187283]: 2025-12-03 14:41:25.204 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:27 np0005544118 nova_compute[187283]: 2025-12-03 14:41:27.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:27 np0005544118 nova_compute[187283]: 2025-12-03 14:41:27.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:27 np0005544118 nova_compute[187283]: 2025-12-03 14:41:27.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:41:27 np0005544118 nova_compute[187283]: 2025-12-03 14:41:27.624 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:41:28 np0005544118 nova_compute[187283]: 2025-12-03 14:41:28.622 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:28 np0005544118 nova_compute[187283]: 2025-12-03 14:41:28.751 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.379 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.626 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.627 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.649 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.650 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.650 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.650 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.828 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.830 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5883MB free_disk=73.3339614868164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.830 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.830 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.888 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.888 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.908 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.925 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.927 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:41:29 np0005544118 nova_compute[187283]: 2025-12-03 14:41:29.927 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:41:30 np0005544118 nova_compute[187283]: 2025-12-03 14:41:30.206 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:30 np0005544118 podman[217299]: 2025-12-03 14:41:30.812298989 +0000 UTC m=+0.048918022 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Dec  3 09:41:31 np0005544118 nova_compute[187283]: 2025-12-03 14:41:31.923 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:32 np0005544118 podman[217318]: 2025-12-03 14:41:32.820159898 +0000 UTC m=+0.054993886 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:41:33 np0005544118 nova_compute[187283]: 2025-12-03 14:41:33.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:34 np0005544118 nova_compute[187283]: 2025-12-03 14:41:34.382 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:35 np0005544118 nova_compute[187283]: 2025-12-03 14:41:35.207 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:35 np0005544118 podman[197639]: time="2025-12-03T14:41:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:41:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:41:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:41:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:41:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec  3 09:41:37 np0005544118 nova_compute[187283]: 2025-12-03 14:41:37.633 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:37 np0005544118 nova_compute[187283]: 2025-12-03 14:41:37.653 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:41:37 np0005544118 nova_compute[187283]: 2025-12-03 14:41:37.653 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:41:38 np0005544118 podman[217345]: 2025-12-03 14:41:38.843413437 +0000 UTC m=+0.077386955 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec  3 09:41:39 np0005544118 nova_compute[187283]: 2025-12-03 14:41:39.383 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:40 np0005544118 nova_compute[187283]: 2025-12-03 14:41:40.209 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:44 np0005544118 nova_compute[187283]: 2025-12-03 14:41:44.386 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:44 np0005544118 ovn_controller[95637]: 2025-12-03T14:41:44Z|00203|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  3 09:41:45 np0005544118 nova_compute[187283]: 2025-12-03 14:41:45.211 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:48 np0005544118 podman[217372]: 2025-12-03 14:41:48.81856171 +0000 UTC m=+0.054528103 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, version=9.6, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec  3 09:41:49 np0005544118 nova_compute[187283]: 2025-12-03 14:41:49.388 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:41:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:41:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:41:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:41:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:41:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:41:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:41:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:41:50 np0005544118 nova_compute[187283]: 2025-12-03 14:41:50.213 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:51 np0005544118 podman[217395]: 2025-12-03 14:41:51.826469749 +0000 UTC m=+0.055521541 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  3 09:41:54 np0005544118 nova_compute[187283]: 2025-12-03 14:41:54.389 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:55 np0005544118 nova_compute[187283]: 2025-12-03 14:41:55.215 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:41:59 np0005544118 nova_compute[187283]: 2025-12-03 14:41:59.390 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:00 np0005544118 nova_compute[187283]: 2025-12-03 14:42:00.217 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:00.975 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:00.975 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:00.975 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:01 np0005544118 podman[217415]: 2025-12-03 14:42:01.814512172 +0000 UTC m=+0.048100639 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:42:03 np0005544118 podman[217434]: 2025-12-03 14:42:03.819265477 +0000 UTC m=+0.049220689 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:42:04 np0005544118 nova_compute[187283]: 2025-12-03 14:42:04.413 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:05 np0005544118 nova_compute[187283]: 2025-12-03 14:42:05.220 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:05 np0005544118 podman[197639]: time="2025-12-03T14:42:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:42:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:42:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:42:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:42:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Dec  3 09:42:09 np0005544118 nova_compute[187283]: 2025-12-03 14:42:09.415 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:09 np0005544118 podman[217458]: 2025-12-03 14:42:09.860839366 +0000 UTC m=+0.085882725 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  3 09:42:10 np0005544118 nova_compute[187283]: 2025-12-03 14:42:10.221 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:10 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:10.338 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:42:10 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:10.339 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:42:10 np0005544118 nova_compute[187283]: 2025-12-03 14:42:10.342 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:14 np0005544118 nova_compute[187283]: 2025-12-03 14:42:14.418 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:15 np0005544118 nova_compute[187283]: 2025-12-03 14:42:15.222 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:19 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:19.342 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:42:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:42:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:42:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:42:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:42:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:42:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:42:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:42:19 np0005544118 nova_compute[187283]: 2025-12-03 14:42:19.420 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:19 np0005544118 podman[217484]: 2025-12-03 14:42:19.824385852 +0000 UTC m=+0.056112646 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:42:20 np0005544118 nova_compute[187283]: 2025-12-03 14:42:20.224 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.055 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.056 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.077 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.213 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.214 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.222 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.223 187287 INFO nova.compute.claims [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.335 187287 DEBUG nova.compute.provider_tree [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.354 187287 DEBUG nova.scheduler.client.report [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.380 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.381 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.422 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.422 187287 DEBUG nova.network.neutron [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.446 187287 INFO nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.467 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.577 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.578 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.578 187287 INFO nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Creating image(s)#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.579 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "/var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.579 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.580 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "/var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.593 187287 DEBUG nova.policy [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '050e065161ed4e4dbf457584926aae78', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '90982dffa8cd42a2b3280941bc8b991e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.596 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.657 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.658 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.659 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.672 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.737 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.738 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.828 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk 1073741824" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.829 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.830 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.891 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.892 187287 DEBUG nova.virt.disk.api [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Checking if we can resize image /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.892 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.952 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.953 187287 DEBUG nova.virt.disk.api [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Cannot resize image /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.954 187287 DEBUG nova.objects.instance [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'migration_context' on Instance uuid c5ae593f-cc24-468c-a57d-0e9e8db00913 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.969 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.969 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Ensure instance console log exists: /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.970 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.970 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:21 np0005544118 nova_compute[187283]: 2025-12-03 14:42:21.970 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:22 np0005544118 nova_compute[187283]: 2025-12-03 14:42:22.188 187287 DEBUG nova.network.neutron [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Successfully created port: a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:42:22 np0005544118 nova_compute[187283]: 2025-12-03 14:42:22.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:22 np0005544118 podman[217521]: 2025-12-03 14:42:22.821451397 +0000 UTC m=+0.055280775 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd)
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.800 187287 DEBUG nova.network.neutron [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Successfully updated port: a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.832 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.832 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquired lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.832 187287 DEBUG nova.network.neutron [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.885 187287 DEBUG nova.compute.manager [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received event network-changed-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.885 187287 DEBUG nova.compute.manager [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Refreshing instance network info cache due to event network-changed-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.886 187287 DEBUG oslo_concurrency.lockutils [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:42:23 np0005544118 nova_compute[187283]: 2025-12-03 14:42:23.989 187287 DEBUG nova.network.neutron [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.424 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.567 187287 DEBUG nova.network.neutron [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Updating instance_info_cache with network_info: [{"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.591 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Releasing lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.591 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Instance network_info: |[{"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.592 187287 DEBUG oslo_concurrency.lockutils [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.592 187287 DEBUG nova.network.neutron [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Refreshing network info cache for port a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.595 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Start _get_guest_xml network_info=[{"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.598 187287 WARNING nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.604 187287 DEBUG nova.virt.libvirt.host [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.604 187287 DEBUG nova.virt.libvirt.host [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.609 187287 DEBUG nova.virt.libvirt.host [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.610 187287 DEBUG nova.virt.libvirt.host [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.611 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.611 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.612 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.612 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.612 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.612 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.612 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.613 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.613 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.613 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.613 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.613 187287 DEBUG nova.virt.hardware [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.617 187287 DEBUG nova.virt.libvirt.vif [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:42:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1778087786',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1778087786',id=24,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-g25qkqlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:42:21Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=c5ae593f-cc24-468c-a57d-0e9e8db00913,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.617 187287 DEBUG nova.network.os_vif_util [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.618 187287 DEBUG nova.network.os_vif_util [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:f1:73,bridge_name='br-int',has_traffic_filtering=True,id=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5b08d51-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.619 187287 DEBUG nova.objects.instance [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'pci_devices' on Instance uuid c5ae593f-cc24-468c-a57d-0e9e8db00913 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.633 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <uuid>c5ae593f-cc24-468c-a57d-0e9e8db00913</uuid>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <name>instance-00000018</name>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteStrategies-server-1778087786</nova:name>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:42:24</nova:creationTime>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:user uuid="050e065161ed4e4dbf457584926aae78">tempest-TestExecuteStrategies-270472559-project-member</nova:user>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:project uuid="90982dffa8cd42a2b3280941bc8b991e">tempest-TestExecuteStrategies-270472559</nova:project>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        <nova:port uuid="a5b08d51-ec89-45e6-9f3c-c9fb406cfd42">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <entry name="serial">c5ae593f-cc24-468c-a57d-0e9e8db00913</entry>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <entry name="uuid">c5ae593f-cc24-468c-a57d-0e9e8db00913</entry>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk.config"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:38:f1:73"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <target dev="tapa5b08d51-ec"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/console.log" append="off"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:42:24 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:42:24 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:42:24 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:42:24 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.634 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Preparing to wait for external event network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.634 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.634 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.635 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.635 187287 DEBUG nova.virt.libvirt.vif [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:42:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1778087786',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1778087786',id=24,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-g25qkqlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:42:21Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=c5ae593f-cc24-468c-a57d-0e9e8db00913,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.636 187287 DEBUG nova.network.os_vif_util [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.636 187287 DEBUG nova.network.os_vif_util [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:f1:73,bridge_name='br-int',has_traffic_filtering=True,id=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5b08d51-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.636 187287 DEBUG os_vif [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:f1:73,bridge_name='br-int',has_traffic_filtering=True,id=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5b08d51-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.637 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.637 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.637 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.640 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.640 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa5b08d51-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.641 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa5b08d51-ec, col_values=(('external_ids', {'iface-id': 'a5b08d51-ec89-45e6-9f3c-c9fb406cfd42', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:f1:73', 'vm-uuid': 'c5ae593f-cc24-468c-a57d-0e9e8db00913'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.642 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:24 np0005544118 NetworkManager[55710]: <info>  [1764772944.6432] manager: (tapa5b08d51-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.644 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.648 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.649 187287 INFO os_vif [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:f1:73,bridge_name='br-int',has_traffic_filtering=True,id=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5b08d51-ec')#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.876 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.877 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.877 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] No VIF found with MAC fa:16:3e:38:f1:73, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:42:24 np0005544118 nova_compute[187283]: 2025-12-03 14:42:24.878 187287 INFO nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Using config drive#033[00m
Dec  3 09:42:25 np0005544118 nova_compute[187283]: 2025-12-03 14:42:25.225 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:25 np0005544118 nova_compute[187283]: 2025-12-03 14:42:25.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:26 np0005544118 nova_compute[187283]: 2025-12-03 14:42:26.591 187287 INFO nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Creating config drive at /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk.config#033[00m
Dec  3 09:42:26 np0005544118 nova_compute[187283]: 2025-12-03 14:42:26.596 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprg4h29w_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:26 np0005544118 nova_compute[187283]: 2025-12-03 14:42:26.720 187287 DEBUG oslo_concurrency.processutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprg4h29w_" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:26 np0005544118 kernel: tapa5b08d51-ec: entered promiscuous mode
Dec  3 09:42:26 np0005544118 NetworkManager[55710]: <info>  [1764772946.7854] manager: (tapa5b08d51-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Dec  3 09:42:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:26Z|00204|binding|INFO|Claiming lport a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 for this chassis.
Dec  3 09:42:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:26Z|00205|binding|INFO|a5b08d51-ec89-45e6-9f3c-c9fb406cfd42: Claiming fa:16:3e:38:f1:73 10.100.0.5
Dec  3 09:42:26 np0005544118 nova_compute[187283]: 2025-12-03 14:42:26.785 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.793 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:f1:73 10.100.0.5'], port_security=['fa:16:3e:38:f1:73 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c5ae593f-cc24-468c-a57d-0e9e8db00913', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.794 104491 INFO neutron.agent.ovn.metadata.agent [-] Port a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.795 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:42:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:26Z|00206|binding|INFO|Setting lport a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 ovn-installed in OVS
Dec  3 09:42:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:26Z|00207|binding|INFO|Setting lport a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 up in Southbound
Dec  3 09:42:26 np0005544118 nova_compute[187283]: 2025-12-03 14:42:26.801 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.809 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[472c89bd-a195-41cc-8b6e-5763f05c7756]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.809 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap267c5b5d-11 in ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.811 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap267c5b5d-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.811 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ef7d0ae2-4caf-47bc-b252-11a1651044f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.811 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[87e6183d-a336-4d9a-9c29-335b706437c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 systemd-udevd[217561]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:42:26 np0005544118 NetworkManager[55710]: <info>  [1764772946.8269] device (tapa5b08d51-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:42:26 np0005544118 NetworkManager[55710]: <info>  [1764772946.8288] device (tapa5b08d51-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.828 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[47b72283-06d1-4fa3-a929-441ccb8d9c72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 systemd-machined[153602]: New machine qemu-19-instance-00000018.
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.841 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[aae68e3d-ef11-4f18-9610-b7e66c6e170b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 systemd[1]: Started Virtual Machine qemu-19-instance-00000018.
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.871 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[696283b6-a5c2-4043-bd45-561936555f1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.876 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[13f2cdfd-9ada-4ec2-b878-564c6ecfbe57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 NetworkManager[55710]: <info>  [1764772946.8772] manager: (tap267c5b5d-10): new Veth device (/org/freedesktop/NetworkManager/Devices/81)
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.909 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[614ff36f-521f-4a72-bf90-69fef503ed75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.912 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[ad7df707-497d-4cc9-9a08-5f3151e8c4aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 NetworkManager[55710]: <info>  [1764772946.9373] device (tap267c5b5d-10): carrier: link connected
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.944 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[527f8194-5235-451a-8041-c1db8f07f90a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.962 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[cfc3c380-df2d-4898-8c50-2469fc06cf18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510254, 'reachable_time': 17633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217594, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.978 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3e337083-c9ae-4bf1-b023-55b13bea9d12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefe:71b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510254, 'tstamp': 510254}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217595, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:26.997 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[3e319389-bb35-4af3-a647-10ca34382c6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510254, 'reachable_time': 17633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217596, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.027 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[309c25cf-6321-42bd-be2c-27b2dbf43f22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.083 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ea99c890-97e2-410b-85a6-86fd83f4f733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.085 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.085 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.086 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:42:27 np0005544118 NetworkManager[55710]: <info>  [1764772947.1292] manager: (tap267c5b5d-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Dec  3 09:42:27 np0005544118 kernel: tap267c5b5d-10: entered promiscuous mode
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.128 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.134 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.132 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:27Z|00208|binding|INFO|Releasing lport f97a1eb5-3e39-4381-b113-959844eaa3b1 from this chassis (sb_readonly=0)
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.135 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.136 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.139 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.139 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f53f01-ef36-49d0-8eb9-73097755f480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.140 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/267c5b5d-1150-48df-8bea-7890da55de3f.pid.haproxy
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 267c5b5d-1150-48df-8bea-7890da55de3f
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:42:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:42:27.141 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'env', 'PROCESS_TAG=haproxy-267c5b5d-1150-48df-8bea-7890da55de3f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/267c5b5d-1150-48df-8bea-7890da55de3f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.149 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:27 np0005544118 podman[217626]: 2025-12-03 14:42:27.504152148 +0000 UTC m=+0.048819179 container create aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:42:27 np0005544118 systemd[1]: Started libpod-conmon-aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37.scope.
Dec  3 09:42:27 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:42:27 np0005544118 podman[217626]: 2025-12-03 14:42:27.475397646 +0000 UTC m=+0.020064737 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:42:27 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6affa18dc932826467200d4b817b8d25532ad9d58d402b889d00945c78302ae5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:42:27 np0005544118 podman[217626]: 2025-12-03 14:42:27.583134496 +0000 UTC m=+0.127801557 container init aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:42:27 np0005544118 podman[217626]: 2025-12-03 14:42:27.588115731 +0000 UTC m=+0.132782762 container start aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:27 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [NOTICE]   (217645) : New worker (217647) forked
Dec  3 09:42:27 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [NOTICE]   (217645) : Loading success.
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.870 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772947.8701956, c5ae593f-cc24-468c-a57d-0e9e8db00913 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.871 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] VM Started (Lifecycle Event)#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.905 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.909 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772947.8709724, c5ae593f-cc24-468c-a57d-0e9e8db00913 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.909 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.927 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.931 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.966 187287 DEBUG nova.compute.manager [req-f889e4f0-6a42-47b5-9254-860ca6fd304f req-b778eabe-94d3-49c4-b8aa-ddadc85591df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received event network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.966 187287 DEBUG oslo_concurrency.lockutils [req-f889e4f0-6a42-47b5-9254-860ca6fd304f req-b778eabe-94d3-49c4-b8aa-ddadc85591df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.966 187287 DEBUG oslo_concurrency.lockutils [req-f889e4f0-6a42-47b5-9254-860ca6fd304f req-b778eabe-94d3-49c4-b8aa-ddadc85591df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.967 187287 DEBUG oslo_concurrency.lockutils [req-f889e4f0-6a42-47b5-9254-860ca6fd304f req-b778eabe-94d3-49c4-b8aa-ddadc85591df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.967 187287 DEBUG nova.compute.manager [req-f889e4f0-6a42-47b5-9254-860ca6fd304f req-b778eabe-94d3-49c4-b8aa-ddadc85591df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Processing event network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.968 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.971 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.974 187287 INFO nova.virt.libvirt.driver [-] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Instance spawned successfully.#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.974 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.976 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.976 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764772947.9708548, c5ae593f-cc24-468c-a57d-0e9e8db00913 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.976 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.994 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.995 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.995 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.995 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.996 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:42:27 np0005544118 nova_compute[187283]: 2025-12-03 14:42:27.996 187287 DEBUG nova.virt.libvirt.driver [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.000 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.003 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.041 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.070 187287 INFO nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Took 6.49 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.070 187287 DEBUG nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.150 187287 INFO nova.compute.manager [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Took 7.02 seconds to build instance.#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.193 187287 DEBUG oslo_concurrency.lockutils [None req-c9bfa16c-6685-40e2-95f5-9a19f0e765e1 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.695 187287 DEBUG nova.network.neutron [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Updated VIF entry in instance network info cache for port a5b08d51-ec89-45e6-9f3c-c9fb406cfd42. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.696 187287 DEBUG nova.network.neutron [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Updating instance_info_cache with network_info: [{"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:42:28 np0005544118 nova_compute[187283]: 2025-12-03 14:42:28.728 187287 DEBUG oslo_concurrency.lockutils [req-7080de3d-84dd-4805-910b-bb942ca73b50 req-f9d36878-109f-4795-9c2a-ca84dcc8e582 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:42:29 np0005544118 nova_compute[187283]: 2025-12-03 14:42:29.642 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.135 187287 DEBUG nova.compute.manager [req-8bd8e6f1-838c-4bb9-a257-4958e5cebcfc req-99980251-e0f5-4f36-8c7c-913f85f10b26 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received event network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.136 187287 DEBUG oslo_concurrency.lockutils [req-8bd8e6f1-838c-4bb9-a257-4958e5cebcfc req-99980251-e0f5-4f36-8c7c-913f85f10b26 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.136 187287 DEBUG oslo_concurrency.lockutils [req-8bd8e6f1-838c-4bb9-a257-4958e5cebcfc req-99980251-e0f5-4f36-8c7c-913f85f10b26 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.137 187287 DEBUG oslo_concurrency.lockutils [req-8bd8e6f1-838c-4bb9-a257-4958e5cebcfc req-99980251-e0f5-4f36-8c7c-913f85f10b26 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.137 187287 DEBUG nova.compute.manager [req-8bd8e6f1-838c-4bb9-a257-4958e5cebcfc req-99980251-e0f5-4f36-8c7c-913f85f10b26 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] No waiting events found dispatching network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.137 187287 WARNING nova.compute.manager [req-8bd8e6f1-838c-4bb9-a257-4958e5cebcfc req-99980251-e0f5-4f36-8c7c-913f85f10b26 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received unexpected event network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.228 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.790 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.791 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.791 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:42:30 np0005544118 nova_compute[187283]: 2025-12-03 14:42:30.792 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid c5ae593f-cc24-468c-a57d-0e9e8db00913 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:42:32 np0005544118 podman[217665]: 2025-12-03 14:42:32.431600984 +0000 UTC m=+0.054694809 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:42:33 np0005544118 nova_compute[187283]: 2025-12-03 14:42:33.998 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Updating instance_info_cache with network_info: [{"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.033 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-c5ae593f-cc24-468c-a57d-0e9e8db00913" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.033 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.034 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.052 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.053 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.054 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.054 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:42:34 np0005544118 podman[217684]: 2025-12-03 14:42:34.157517445 +0000 UTC m=+0.053843145 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.166 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.221 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.223 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.279 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.437 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.438 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5743MB free_disk=73.33320617675781GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.439 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.439 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.645 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.657 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance c5ae593f-cc24-468c-a57d-0e9e8db00913 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.658 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.658 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.694 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.706 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.738 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:42:34 np0005544118 nova_compute[187283]: 2025-12-03 14:42:34.739 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:42:35 np0005544118 nova_compute[187283]: 2025-12-03 14:42:35.231 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:35 np0005544118 podman[197639]: time="2025-12-03T14:42:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:42:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:42:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:42:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:42:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3064 "" "Go-http-client/1.1"
Dec  3 09:42:37 np0005544118 nova_compute[187283]: 2025-12-03 14:42:37.734 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:38 np0005544118 nova_compute[187283]: 2025-12-03 14:42:38.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:42:38 np0005544118 nova_compute[187283]: 2025-12-03 14:42:38.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:42:39 np0005544118 nova_compute[187283]: 2025-12-03 14:42:39.647 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:40 np0005544118 nova_compute[187283]: 2025-12-03 14:42:40.232 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:40 np0005544118 podman[217728]: 2025-12-03 14:42:40.86593637 +0000 UTC m=+0.091453938 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:42:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:41Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:f1:73 10.100.0.5
Dec  3 09:42:41 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:41Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:f1:73 10.100.0.5
Dec  3 09:42:44 np0005544118 nova_compute[187283]: 2025-12-03 14:42:44.649 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:45 np0005544118 nova_compute[187283]: 2025-12-03 14:42:45.266 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:42:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:42:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:42:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:42:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:42:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:42:49 np0005544118 nova_compute[187283]: 2025-12-03 14:42:49.651 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:50 np0005544118 nova_compute[187283]: 2025-12-03 14:42:50.270 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:50 np0005544118 podman[217754]: 2025-12-03 14:42:50.818449016 +0000 UTC m=+0.051887392 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec  3 09:42:53 np0005544118 podman[217775]: 2025-12-03 14:42:53.817595017 +0000 UTC m=+0.054969196 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  3 09:42:54 np0005544118 nova_compute[187283]: 2025-12-03 14:42:54.653 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:55 np0005544118 nova_compute[187283]: 2025-12-03 14:42:55.276 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:42:56 np0005544118 ovn_controller[95637]: 2025-12-03T14:42:56Z|00209|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Dec  3 09:42:59 np0005544118 nova_compute[187283]: 2025-12-03 14:42:59.659 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:00 np0005544118 nova_compute[187283]: 2025-12-03 14:43:00.279 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:00.982 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:00.983 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:00.984 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:02 np0005544118 podman[217796]: 2025-12-03 14:43:02.821426345 +0000 UTC m=+0.048331695 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  3 09:43:04 np0005544118 nova_compute[187283]: 2025-12-03 14:43:04.661 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:04 np0005544118 podman[217817]: 2025-12-03 14:43:04.814610996 +0000 UTC m=+0.045408456 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:43:05 np0005544118 nova_compute[187283]: 2025-12-03 14:43:05.280 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:05 np0005544118 podman[197639]: time="2025-12-03T14:43:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:43:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:43:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:43:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:43:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3068 "" "Go-http-client/1.1"
Dec  3 09:43:09 np0005544118 nova_compute[187283]: 2025-12-03 14:43:09.662 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:10 np0005544118 nova_compute[187283]: 2025-12-03 14:43:10.282 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:11 np0005544118 podman[217841]: 2025-12-03 14:43:11.83766545 +0000 UTC m=+0.068908996 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec  3 09:43:14 np0005544118 nova_compute[187283]: 2025-12-03 14:43:14.664 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:15 np0005544118 nova_compute[187283]: 2025-12-03 14:43:15.284 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:16 np0005544118 nova_compute[187283]: 2025-12-03 14:43:16.148 187287 DEBUG nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Creating tmpfile /var/lib/nova/instances/tmplv5i5xmt to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:43:16 np0005544118 nova_compute[187283]: 2025-12-03 14:43:16.149 187287 DEBUG nova.compute.manager [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplv5i5xmt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:43:17 np0005544118 nova_compute[187283]: 2025-12-03 14:43:17.147 187287 DEBUG nova.compute.manager [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplv5i5xmt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='10be4230-b79b-4485-8ff7-2d1fba98d5ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:43:17 np0005544118 nova_compute[187283]: 2025-12-03 14:43:17.235 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:43:17 np0005544118 nova_compute[187283]: 2025-12-03 14:43:17.235 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:43:17 np0005544118 nova_compute[187283]: 2025-12-03 14:43:17.235 187287 DEBUG nova.network.neutron [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.458 187287 DEBUG nova.network.neutron [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Updating instance_info_cache with network_info: [{"id": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "address": "fa:16:3e:76:64:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4e961e-04", "ovs_interfaceid": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.574 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.576 187287 DEBUG nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplv5i5xmt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='10be4230-b79b-4485-8ff7-2d1fba98d5ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.576 187287 DEBUG nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Creating instance directory: /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.576 187287 DEBUG nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Creating disk.info with the contents: {'/var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk': 'qcow2', '/var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.577 187287 DEBUG nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.577 187287 DEBUG nova.objects.instance [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 10be4230-b79b-4485-8ff7-2d1fba98d5ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.665 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.732 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.733 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.734 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.750 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.805 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.806 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.840 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.841 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.842 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.891 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.892 187287 DEBUG nova.virt.disk.api [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.893 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.950 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.951 187287 DEBUG nova.virt.disk.api [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.952 187287 DEBUG nova.objects.instance [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 10be4230-b79b-4485-8ff7-2d1fba98d5ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.968 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.996 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk.config 485376" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.997 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk.config to /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:43:18 np0005544118 nova_compute[187283]: 2025-12-03 14:43:18.998 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk.config /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:43:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:43:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:43:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:43:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.482 187287 DEBUG oslo_concurrency.processutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk.config /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.483 187287 DEBUG nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.484 187287 DEBUG nova.virt.libvirt.vif [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:42:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-944143149',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-944143149',id=23,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:42:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-b5vvgenk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0'
,owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:42:13Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=10be4230-b79b-4485-8ff7-2d1fba98d5ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "address": "fa:16:3e:76:64:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3e4e961e-04", "ovs_interfaceid": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.485 187287 DEBUG nova.network.os_vif_util [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "address": "fa:16:3e:76:64:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3e4e961e-04", "ovs_interfaceid": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.486 187287 DEBUG nova.network.os_vif_util [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:64:70,bridge_name='br-int',has_traffic_filtering=True,id=3e4e961e-0428-44c0-a5bd-2bcd08106c85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4e961e-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.486 187287 DEBUG os_vif [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:64:70,bridge_name='br-int',has_traffic_filtering=True,id=3e4e961e-0428-44c0-a5bd-2bcd08106c85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4e961e-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.487 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.487 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.488 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.492 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.493 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e4e961e-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.493 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e4e961e-04, col_values=(('external_ids', {'iface-id': '3e4e961e-0428-44c0-a5bd-2bcd08106c85', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:76:64:70', 'vm-uuid': '10be4230-b79b-4485-8ff7-2d1fba98d5ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.495 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:19 np0005544118 NetworkManager[55710]: <info>  [1764772999.4966] manager: (tap3e4e961e-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.498 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.502 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.503 187287 INFO os_vif [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:64:70,bridge_name='br-int',has_traffic_filtering=True,id=3e4e961e-0428-44c0-a5bd-2bcd08106c85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4e961e-04')#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.503 187287 DEBUG nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:43:19 np0005544118 nova_compute[187283]: 2025-12-03 14:43:19.504 187287 DEBUG nova.compute.manager [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplv5i5xmt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='10be4230-b79b-4485-8ff7-2d1fba98d5ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:43:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:20.090 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:43:20 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:20.091 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:43:20 np0005544118 nova_compute[187283]: 2025-12-03 14:43:20.126 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:20 np0005544118 nova_compute[187283]: 2025-12-03 14:43:20.286 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:20 np0005544118 nova_compute[187283]: 2025-12-03 14:43:20.781 187287 DEBUG nova.network.neutron [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Port 3e4e961e-0428-44c0-a5bd-2bcd08106c85 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:43:20 np0005544118 nova_compute[187283]: 2025-12-03 14:43:20.782 187287 DEBUG nova.compute.manager [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmplv5i5xmt',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='10be4230-b79b-4485-8ff7-2d1fba98d5ab',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:43:20 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:43:20 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:43:21 np0005544118 podman[217891]: 2025-12-03 14:43:21.011431837 +0000 UTC m=+0.066522731 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.tags=minimal rhel9)
Dec  3 09:43:21 np0005544118 kernel: tap3e4e961e-04: entered promiscuous mode
Dec  3 09:43:21 np0005544118 NetworkManager[55710]: <info>  [1764773001.1206] manager: (tap3e4e961e-04): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Dec  3 09:43:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:21Z|00210|binding|INFO|Claiming lport 3e4e961e-0428-44c0-a5bd-2bcd08106c85 for this additional chassis.
Dec  3 09:43:21 np0005544118 nova_compute[187283]: 2025-12-03 14:43:21.122 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:21Z|00211|binding|INFO|3e4e961e-0428-44c0-a5bd-2bcd08106c85: Claiming fa:16:3e:76:64:70 10.100.0.4
Dec  3 09:43:21 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:21Z|00212|binding|INFO|Setting lport 3e4e961e-0428-44c0-a5bd-2bcd08106c85 ovn-installed in OVS
Dec  3 09:43:21 np0005544118 nova_compute[187283]: 2025-12-03 14:43:21.147 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:21 np0005544118 systemd-udevd[217939]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:43:21 np0005544118 NetworkManager[55710]: <info>  [1764773001.1716] device (tap3e4e961e-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:43:21 np0005544118 NetworkManager[55710]: <info>  [1764773001.1736] device (tap3e4e961e-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:43:21 np0005544118 nova_compute[187283]: 2025-12-03 14:43:21.193 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:21 np0005544118 systemd-machined[153602]: New machine qemu-20-instance-00000017.
Dec  3 09:43:21 np0005544118 systemd[1]: Started Virtual Machine qemu-20-instance-00000017.
Dec  3 09:43:21 np0005544118 nova_compute[187283]: 2025-12-03 14:43:21.859 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773001.8589492, 10be4230-b79b-4485-8ff7-2d1fba98d5ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:43:21 np0005544118 nova_compute[187283]: 2025-12-03 14:43:21.859 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] VM Started (Lifecycle Event)#033[00m
Dec  3 09:43:21 np0005544118 nova_compute[187283]: 2025-12-03 14:43:21.880 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:43:22 np0005544118 nova_compute[187283]: 2025-12-03 14:43:22.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:22 np0005544118 nova_compute[187283]: 2025-12-03 14:43:22.701 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773002.700602, 10be4230-b79b-4485-8ff7-2d1fba98d5ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:43:22 np0005544118 nova_compute[187283]: 2025-12-03 14:43:22.703 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:43:22 np0005544118 nova_compute[187283]: 2025-12-03 14:43:22.733 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:43:22 np0005544118 nova_compute[187283]: 2025-12-03 14:43:22.736 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:43:22 np0005544118 nova_compute[187283]: 2025-12-03 14:43:22.768 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:43:24 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:24Z|00213|binding|INFO|Claiming lport 3e4e961e-0428-44c0-a5bd-2bcd08106c85 for this chassis.
Dec  3 09:43:24 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:24Z|00214|binding|INFO|3e4e961e-0428-44c0-a5bd-2bcd08106c85: Claiming fa:16:3e:76:64:70 10.100.0.4
Dec  3 09:43:24 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:24Z|00215|binding|INFO|Setting lport 3e4e961e-0428-44c0-a5bd-2bcd08106c85 up in Southbound
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.020 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:64:70 10.100.0.4'], port_security=['fa:16:3e:76:64:70 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '10be4230-b79b-4485-8ff7-2d1fba98d5ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=3e4e961e-0428-44c0-a5bd-2bcd08106c85) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.022 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 3e4e961e-0428-44c0-a5bd-2bcd08106c85 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f bound to our chassis#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.023 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.036 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd8774b-fb8f-47ee-8abb-40bf415240fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.065 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbfd123-33ba-4046-9731-f0f4e0a39ddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.068 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[0db95f22-736f-454f-9a23-d66ac2c5899e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.095 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[21c4f9a2-3dcc-423e-a0d7-4d8da167e874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.110 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ca358e12-d9b3-4fb2-a219-c75800c58c45]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510254, 'reachable_time': 17633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217966, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.125 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ca08bf77-0279-4dce-983b-17c3a0b67694]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510266, 'tstamp': 510266}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217967, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510268, 'tstamp': 510268}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217967, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.127 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:24 np0005544118 nova_compute[187283]: 2025-12-03 14:43:24.128 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.129 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.129 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.130 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:24 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:24.130 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:43:24 np0005544118 nova_compute[187283]: 2025-12-03 14:43:24.185 187287 INFO nova.compute.manager [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Post operation of migration started#033[00m
Dec  3 09:43:24 np0005544118 nova_compute[187283]: 2025-12-03 14:43:24.496 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:24 np0005544118 nova_compute[187283]: 2025-12-03 14:43:24.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:24 np0005544118 podman[217968]: 2025-12-03 14:43:24.831599368 +0000 UTC m=+0.061154635 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:43:24 np0005544118 nova_compute[187283]: 2025-12-03 14:43:24.906 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:43:24 np0005544118 nova_compute[187283]: 2025-12-03 14:43:24.906 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:43:24 np0005544118 nova_compute[187283]: 2025-12-03 14:43:24.906 187287 DEBUG nova.network.neutron [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:43:25 np0005544118 nova_compute[187283]: 2025-12-03 14:43:25.287 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:26 np0005544118 nova_compute[187283]: 2025-12-03 14:43:26.135 187287 DEBUG nova.network.neutron [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Updating instance_info_cache with network_info: [{"id": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "address": "fa:16:3e:76:64:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4e961e-04", "ovs_interfaceid": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:43:26 np0005544118 nova_compute[187283]: 2025-12-03 14:43:26.567 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:43:26 np0005544118 nova_compute[187283]: 2025-12-03 14:43:26.593 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:26 np0005544118 nova_compute[187283]: 2025-12-03 14:43:26.594 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:26 np0005544118 nova_compute[187283]: 2025-12-03 14:43:26.594 187287 DEBUG oslo_concurrency.lockutils [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:26 np0005544118 nova_compute[187283]: 2025-12-03 14:43:26.598 187287 INFO nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:43:26 np0005544118 virtqemud[186958]: Domain id=20 name='instance-00000017' uuid=10be4230-b79b-4485-8ff7-2d1fba98d5ab is tainted: custom-monitor
Dec  3 09:43:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:27.093 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:27 np0005544118 nova_compute[187283]: 2025-12-03 14:43:27.605 187287 INFO nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:43:27 np0005544118 nova_compute[187283]: 2025-12-03 14:43:27.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:27 np0005544118 nova_compute[187283]: 2025-12-03 14:43:27.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:28 np0005544118 nova_compute[187283]: 2025-12-03 14:43:28.611 187287 INFO nova.virt.libvirt.driver [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:43:28 np0005544118 nova_compute[187283]: 2025-12-03 14:43:28.616 187287 DEBUG nova.compute.manager [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:43:28 np0005544118 nova_compute[187283]: 2025-12-03 14:43:28.728 187287 DEBUG nova.objects.instance [None req-bdc1ec44-b5b9-452f-8882-c1381d756475 b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:43:29 np0005544118 nova_compute[187283]: 2025-12-03 14:43:29.498 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:29 np0005544118 nova_compute[187283]: 2025-12-03 14:43:29.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:30 np0005544118 nova_compute[187283]: 2025-12-03 14:43:30.288 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:31 np0005544118 nova_compute[187283]: 2025-12-03 14:43:31.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:31 np0005544118 nova_compute[187283]: 2025-12-03 14:43:31.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:43:31 np0005544118 nova_compute[187283]: 2025-12-03 14:43:31.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:43:32 np0005544118 nova_compute[187283]: 2025-12-03 14:43:32.753 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:43:32 np0005544118 nova_compute[187283]: 2025-12-03 14:43:32.754 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:43:32 np0005544118 nova_compute[187283]: 2025-12-03 14:43:32.754 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:43:32 np0005544118 nova_compute[187283]: 2025-12-03 14:43:32.755 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 10be4230-b79b-4485-8ff7-2d1fba98d5ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.251 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.252 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.252 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.252 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.252 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.253 187287 INFO nova.compute.manager [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Terminating instance#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.254 187287 DEBUG nova.compute.manager [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:43:33 np0005544118 kernel: tapa5b08d51-ec (unregistering): left promiscuous mode
Dec  3 09:43:33 np0005544118 NetworkManager[55710]: <info>  [1764773013.2919] device (tapa5b08d51-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.298 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:33Z|00216|binding|INFO|Releasing lport a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 from this chassis (sb_readonly=0)
Dec  3 09:43:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:33Z|00217|binding|INFO|Setting lport a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 down in Southbound
Dec  3 09:43:33 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:33Z|00218|binding|INFO|Removing iface tapa5b08d51-ec ovn-installed in OVS
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.301 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.306 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:f1:73 10.100.0.5'], port_security=['fa:16:3e:38:f1:73 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c5ae593f-cc24-468c-a57d-0e9e8db00913', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.309 104491 INFO neutron.agent.ovn.metadata.agent [-] Port a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.310 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 267c5b5d-1150-48df-8bea-7890da55de3f#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.311 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.334 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3c63b8-1d34-4191-b802-62979ad04abc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:33 np0005544118 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec  3 09:43:33 np0005544118 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000018.scope: Consumed 15.959s CPU time.
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.372 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[387001b3-1913-4b93-8a73-2d7d3a58e2bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:33 np0005544118 systemd-machined[153602]: Machine qemu-19-instance-00000018 terminated.
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.376 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[e09cc322-139f-425f-acde-9a05a0929ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:33 np0005544118 podman[217996]: 2025-12-03 14:43:33.385838405 +0000 UTC m=+0.062746068 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.409 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[0442ff42-9b5d-422b-8ded-733f4aa85e94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.431 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f832df-d2e6-4d64-b79e-fa874bf14d15]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap267c5b5d-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fe:71:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510254, 'reachable_time': 17633, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218024, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.447 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[394612db-b353-42e4-866e-e85b619219db]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510266, 'tstamp': 510266}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218025, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap267c5b5d-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 510268, 'tstamp': 510268}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218025, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.449 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.450 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.455 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.456 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap267c5b5d-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.456 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.456 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap267c5b5d-10, col_values=(('external_ids', {'iface-id': 'f97a1eb5-3e39-4381-b113-959844eaa3b1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:33.457 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.476 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.481 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.517 187287 INFO nova.virt.libvirt.driver [-] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Instance destroyed successfully.#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.518 187287 DEBUG nova.objects.instance [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid c5ae593f-cc24-468c-a57d-0e9e8db00913 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.522 187287 DEBUG nova.compute.manager [req-5453da53-6135-46db-bf10-9bfb25d9028e req-fdb3587f-ea55-4c96-bd8b-741645452d44 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received event network-vif-unplugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.522 187287 DEBUG oslo_concurrency.lockutils [req-5453da53-6135-46db-bf10-9bfb25d9028e req-fdb3587f-ea55-4c96-bd8b-741645452d44 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.522 187287 DEBUG oslo_concurrency.lockutils [req-5453da53-6135-46db-bf10-9bfb25d9028e req-fdb3587f-ea55-4c96-bd8b-741645452d44 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.522 187287 DEBUG oslo_concurrency.lockutils [req-5453da53-6135-46db-bf10-9bfb25d9028e req-fdb3587f-ea55-4c96-bd8b-741645452d44 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.523 187287 DEBUG nova.compute.manager [req-5453da53-6135-46db-bf10-9bfb25d9028e req-fdb3587f-ea55-4c96-bd8b-741645452d44 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] No waiting events found dispatching network-vif-unplugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.523 187287 DEBUG nova.compute.manager [req-5453da53-6135-46db-bf10-9bfb25d9028e req-fdb3587f-ea55-4c96-bd8b-741645452d44 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received event network-vif-unplugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.532 187287 DEBUG nova.virt.libvirt.vif [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:42:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1778087786',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1778087786',id=24,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:42:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-g25qkqlb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_
name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:42:28Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=c5ae593f-cc24-468c-a57d-0e9e8db00913,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.533 187287 DEBUG nova.network.os_vif_util [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "address": "fa:16:3e:38:f1:73", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa5b08d51-ec", "ovs_interfaceid": "a5b08d51-ec89-45e6-9f3c-c9fb406cfd42", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.533 187287 DEBUG nova.network.os_vif_util [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:f1:73,bridge_name='br-int',has_traffic_filtering=True,id=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5b08d51-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.534 187287 DEBUG os_vif [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:f1:73,bridge_name='br-int',has_traffic_filtering=True,id=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5b08d51-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.535 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.536 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa5b08d51-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.537 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.538 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.541 187287 INFO os_vif [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:f1:73,bridge_name='br-int',has_traffic_filtering=True,id=a5b08d51-ec89-45e6-9f3c-c9fb406cfd42,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa5b08d51-ec')#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.541 187287 INFO nova.virt.libvirt.driver [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Deleting instance files /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913_del#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.542 187287 INFO nova.virt.libvirt.driver [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Deletion of /var/lib/nova/instances/c5ae593f-cc24-468c-a57d-0e9e8db00913_del complete#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.586 187287 INFO nova.compute.manager [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.586 187287 DEBUG oslo.service.loopingcall [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.587 187287 DEBUG nova.compute.manager [-] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.587 187287 DEBUG nova.network.neutron [-] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.818 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Updating instance_info_cache with network_info: [{"id": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "address": "fa:16:3e:76:64:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4e961e-04", "ovs_interfaceid": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.833 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-10be4230-b79b-4485-8ff7-2d1fba98d5ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.833 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.834 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.854 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.855 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.855 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.855 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.919 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.976 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:33 np0005544118 nova_compute[187283]: 2025-12-03 14:43:33.978 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.030 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.162 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.164 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5696MB free_disk=73.30478286743164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.164 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.165 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.247 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance c5ae593f-cc24-468c-a57d-0e9e8db00913 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.247 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 10be4230-b79b-4485-8ff7-2d1fba98d5ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.248 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.248 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.334 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.346 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.369 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:43:34 np0005544118 nova_compute[187283]: 2025-12-03 14:43:34.369 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.205s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.291 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.611 187287 DEBUG nova.compute.manager [req-89af9f55-9ac6-4f18-aa6d-bb6dcb348133 req-8e450640-523c-42b3-80c4-dd321c8ffe9f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received event network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.611 187287 DEBUG oslo_concurrency.lockutils [req-89af9f55-9ac6-4f18-aa6d-bb6dcb348133 req-8e450640-523c-42b3-80c4-dd321c8ffe9f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.612 187287 DEBUG oslo_concurrency.lockutils [req-89af9f55-9ac6-4f18-aa6d-bb6dcb348133 req-8e450640-523c-42b3-80c4-dd321c8ffe9f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.612 187287 DEBUG oslo_concurrency.lockutils [req-89af9f55-9ac6-4f18-aa6d-bb6dcb348133 req-8e450640-523c-42b3-80c4-dd321c8ffe9f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.612 187287 DEBUG nova.compute.manager [req-89af9f55-9ac6-4f18-aa6d-bb6dcb348133 req-8e450640-523c-42b3-80c4-dd321c8ffe9f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] No waiting events found dispatching network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.613 187287 WARNING nova.compute.manager [req-89af9f55-9ac6-4f18-aa6d-bb6dcb348133 req-8e450640-523c-42b3-80c4-dd321c8ffe9f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received unexpected event network-vif-plugged-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 for instance with vm_state active and task_state deleting.#033[00m
Dec  3 09:43:35 np0005544118 podman[197639]: time="2025-12-03T14:43:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:43:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:43:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:43:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:43:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3061 "" "Go-http-client/1.1"
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.793 187287 DEBUG nova.network.neutron [-] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.818 187287 INFO nova.compute.manager [-] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Took 2.23 seconds to deallocate network for instance.#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.875 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.875 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.972 187287 DEBUG nova.compute.provider_tree [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:43:35 np0005544118 nova_compute[187283]: 2025-12-03 14:43:35.987 187287 DEBUG nova.scheduler.client.report [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:43:36 np0005544118 podman[218050]: 2025-12-03 14:43:36.000144959 +0000 UTC m=+0.074998741 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:43:36 np0005544118 nova_compute[187283]: 2025-12-03 14:43:36.009 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:36 np0005544118 nova_compute[187283]: 2025-12-03 14:43:36.032 187287 INFO nova.scheduler.client.report [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance c5ae593f-cc24-468c-a57d-0e9e8db00913#033[00m
Dec  3 09:43:36 np0005544118 nova_compute[187283]: 2025-12-03 14:43:36.139 187287 DEBUG oslo_concurrency.lockutils [None req-c4e5f314-ff23-4fb4-b6c4-df4841163cfe 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "c5ae593f-cc24-468c-a57d-0e9e8db00913" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.888s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.046 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.047 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.048 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.048 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.048 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.050 187287 INFO nova.compute.manager [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Terminating instance#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.051 187287 DEBUG nova.compute.manager [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:43:37 np0005544118 kernel: tap3e4e961e-04 (unregistering): left promiscuous mode
Dec  3 09:43:37 np0005544118 NetworkManager[55710]: <info>  [1764773017.0757] device (tap3e4e961e-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:43:37 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:37Z|00219|binding|INFO|Releasing lport 3e4e961e-0428-44c0-a5bd-2bcd08106c85 from this chassis (sb_readonly=0)
Dec  3 09:43:37 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:37Z|00220|binding|INFO|Setting lport 3e4e961e-0428-44c0-a5bd-2bcd08106c85 down in Southbound
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.083 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 ovn_controller[95637]: 2025-12-03T14:43:37Z|00221|binding|INFO|Removing iface tap3e4e961e-04 ovn-installed in OVS
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.085 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.091 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:64:70 10.100.0.4'], port_security=['fa:16:3e:76:64:70 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '10be4230-b79b-4485-8ff7-2d1fba98d5ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-267c5b5d-1150-48df-8bea-7890da55de3f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '90982dffa8cd42a2b3280941bc8b991e', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'f9cb5a57-6453-4f41-b617-15214005aca5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f95fa6eb-4161-4547-961a-85a1f4412484, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=3e4e961e-0428-44c0-a5bd-2bcd08106c85) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.092 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 3e4e961e-0428-44c0-a5bd-2bcd08106c85 in datapath 267c5b5d-1150-48df-8bea-7890da55de3f unbound from our chassis#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.092 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 267c5b5d-1150-48df-8bea-7890da55de3f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.093 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[854c691d-9d6c-4854-bb5d-8d1cd82c04d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.094 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f namespace which is not needed anymore#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.103 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000017.scope: Deactivated successfully.
Dec  3 09:43:37 np0005544118 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000017.scope: Consumed 1.703s CPU time.
Dec  3 09:43:37 np0005544118 systemd-machined[153602]: Machine qemu-20-instance-00000017 terminated.
Dec  3 09:43:37 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [NOTICE]   (217645) : haproxy version is 2.8.14-c23fe91
Dec  3 09:43:37 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [NOTICE]   (217645) : path to executable is /usr/sbin/haproxy
Dec  3 09:43:37 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [WARNING]  (217645) : Exiting Master process...
Dec  3 09:43:37 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [WARNING]  (217645) : Exiting Master process...
Dec  3 09:43:37 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [ALERT]    (217645) : Current worker (217647) exited with code 143 (Terminated)
Dec  3 09:43:37 np0005544118 neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f[217641]: [WARNING]  (217645) : All workers exited. Exiting... (0)
Dec  3 09:43:37 np0005544118 systemd[1]: libpod-aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37.scope: Deactivated successfully.
Dec  3 09:43:37 np0005544118 podman[218098]: 2025-12-03 14:43:37.23076281 +0000 UTC m=+0.043455893 container died aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:43:37 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37-userdata-shm.mount: Deactivated successfully.
Dec  3 09:43:37 np0005544118 systemd[1]: var-lib-containers-storage-overlay-6affa18dc932826467200d4b817b8d25532ad9d58d402b889d00945c78302ae5-merged.mount: Deactivated successfully.
Dec  3 09:43:37 np0005544118 podman[218098]: 2025-12-03 14:43:37.271271831 +0000 UTC m=+0.083964914 container cleanup aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:43:37 np0005544118 systemd[1]: libpod-conmon-aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37.scope: Deactivated successfully.
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.309 187287 INFO nova.virt.libvirt.driver [-] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Instance destroyed successfully.#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.309 187287 DEBUG nova.objects.instance [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lazy-loading 'resources' on Instance uuid 10be4230-b79b-4485-8ff7-2d1fba98d5ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.326 187287 DEBUG nova.virt.libvirt.vif [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:42:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-944143149',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-944143149',id=23,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:42:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='90982dffa8cd42a2b3280941bc8b991e',ramdisk_id='',reservation_id='r-b5vvgenk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0
',owner_project_name='tempest-TestExecuteStrategies-270472559',owner_user_name='tempest-TestExecuteStrategies-270472559-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:43:29Z,user_data=None,user_id='050e065161ed4e4dbf457584926aae78',uuid=10be4230-b79b-4485-8ff7-2d1fba98d5ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "address": "fa:16:3e:76:64:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4e961e-04", "ovs_interfaceid": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.326 187287 DEBUG nova.network.os_vif_util [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converting VIF {"id": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "address": "fa:16:3e:76:64:70", "network": {"id": "267c5b5d-1150-48df-8bea-7890da55de3f", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1866364698-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "90982dffa8cd42a2b3280941bc8b991e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e4e961e-04", "ovs_interfaceid": "3e4e961e-0428-44c0-a5bd-2bcd08106c85", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.327 187287 DEBUG nova.network.os_vif_util [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:76:64:70,bridge_name='br-int',has_traffic_filtering=True,id=3e4e961e-0428-44c0-a5bd-2bcd08106c85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4e961e-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.327 187287 DEBUG os_vif [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:64:70,bridge_name='br-int',has_traffic_filtering=True,id=3e4e961e-0428-44c0-a5bd-2bcd08106c85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4e961e-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.329 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.329 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e4e961e-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.330 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.332 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.336 187287 INFO os_vif [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:76:64:70,bridge_name='br-int',has_traffic_filtering=True,id=3e4e961e-0428-44c0-a5bd-2bcd08106c85,network=Network(267c5b5d-1150-48df-8bea-7890da55de3f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e4e961e-04')#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.336 187287 INFO nova.virt.libvirt.driver [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Deleting instance files /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab_del#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.337 187287 INFO nova.virt.libvirt.driver [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Deletion of /var/lib/nova/instances/10be4230-b79b-4485-8ff7-2d1fba98d5ab_del complete#033[00m
Dec  3 09:43:37 np0005544118 podman[218134]: 2025-12-03 14:43:37.341561633 +0000 UTC m=+0.048809469 container remove aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.346 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[10334ee3-9c22-4936-9862-950ce3d2d224]: (4, ('Wed Dec  3 02:43:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37)\naee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37\nWed Dec  3 02:43:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f (aee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37)\naee92c0b6c19d84b72448a3cdf7ee840f4f9b5826c3249878046123be157db37\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.348 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b526bcf6-3a10-4f09-9726-20b47529d22b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.348 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap267c5b5d-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.350 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 kernel: tap267c5b5d-10: left promiscuous mode
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.362 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.365 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3a2e7a-d178-43b6-b464-359c8de5b7ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.380 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fcefb937-ef1b-41ea-a940-ee4579e08645]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.382 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[60cb2d9b-b35e-4cd9-a3a6-0d9c2afda430]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.386 187287 INFO nova.compute.manager [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.386 187287 DEBUG oslo.service.loopingcall [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.387 187287 DEBUG nova.compute.manager [-] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.388 187287 DEBUG nova.network.neutron [-] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.400 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[92d7f40c-ef34-4bfa-85d1-69753ce33c3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 510247, 'reachable_time': 23374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218159, 'error': None, 'target': 'ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 systemd[1]: run-netns-ovnmeta\x2d267c5b5d\x2d1150\x2d48df\x2d8bea\x2d7890da55de3f.mount: Deactivated successfully.
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.405 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-267c5b5d-1150-48df-8bea-7890da55de3f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:43:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:43:37.405 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[97a9ba32-cdbf-4a24-8c9a-00e61869e18c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:43:37 np0005544118 nova_compute[187283]: 2025-12-03 14:43:37.700 187287 DEBUG nova.compute.manager [req-a05fa11d-f561-4d40-b9c7-f92a9b71dd4d req-2e47cd2c-103e-49f7-8bc7-fd5b6d2c417d c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Received event network-vif-deleted-a5b08d51-ec89-45e6-9f3c-c9fb406cfd42 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.364 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.488 187287 DEBUG nova.network.neutron [-] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.505 187287 INFO nova.compute.manager [-] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Took 1.12 seconds to deallocate network for instance.#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.565 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.565 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.614 187287 DEBUG nova.compute.provider_tree [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.641 187287 DEBUG nova.scheduler.client.report [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.661 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.096s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.685 187287 INFO nova.scheduler.client.report [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Deleted allocations for instance 10be4230-b79b-4485-8ff7-2d1fba98d5ab#033[00m
Dec  3 09:43:38 np0005544118 nova_compute[187283]: 2025-12-03 14:43:38.744 187287 DEBUG oslo_concurrency.lockutils [None req-4fdd6ac4-93b8-4931-ac08-c9cf5fcb9b4d 050e065161ed4e4dbf457584926aae78 90982dffa8cd42a2b3280941bc8b991e - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.619 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.620 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.805 187287 DEBUG nova.compute.manager [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Received event network-vif-unplugged-3e4e961e-0428-44c0-a5bd-2bcd08106c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.806 187287 DEBUG oslo_concurrency.lockutils [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.806 187287 DEBUG oslo_concurrency.lockutils [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.807 187287 DEBUG oslo_concurrency.lockutils [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.807 187287 DEBUG nova.compute.manager [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] No waiting events found dispatching network-vif-unplugged-3e4e961e-0428-44c0-a5bd-2bcd08106c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.808 187287 WARNING nova.compute.manager [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Received unexpected event network-vif-unplugged-3e4e961e-0428-44c0-a5bd-2bcd08106c85 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.808 187287 DEBUG nova.compute.manager [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Received event network-vif-deleted-3e4e961e-0428-44c0-a5bd-2bcd08106c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.808 187287 DEBUG nova.compute.manager [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Received event network-vif-plugged-3e4e961e-0428-44c0-a5bd-2bcd08106c85 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.809 187287 DEBUG oslo_concurrency.lockutils [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.809 187287 DEBUG oslo_concurrency.lockutils [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.810 187287 DEBUG oslo_concurrency.lockutils [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "10be4230-b79b-4485-8ff7-2d1fba98d5ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.810 187287 DEBUG nova.compute.manager [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] No waiting events found dispatching network-vif-plugged-3e4e961e-0428-44c0-a5bd-2bcd08106c85 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:43:39 np0005544118 nova_compute[187283]: 2025-12-03 14:43:39.811 187287 WARNING nova.compute.manager [req-011bc027-fc1a-41d0-a135-437cf696f346 req-2f5332f9-6462-45f0-b872-4dd2a3338acb c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Received unexpected event network-vif-plugged-3e4e961e-0428-44c0-a5bd-2bcd08106c85 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:43:40 np0005544118 nova_compute[187283]: 2025-12-03 14:43:40.293 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:42 np0005544118 nova_compute[187283]: 2025-12-03 14:43:42.330 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:42 np0005544118 podman[218160]: 2025-12-03 14:43:42.868488313 +0000 UTC m=+0.101134801 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:43:45 np0005544118 nova_compute[187283]: 2025-12-03 14:43:45.295 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:47 np0005544118 nova_compute[187283]: 2025-12-03 14:43:47.333 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:48 np0005544118 nova_compute[187283]: 2025-12-03 14:43:48.517 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764773013.5160887, c5ae593f-cc24-468c-a57d-0e9e8db00913 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:43:48 np0005544118 nova_compute[187283]: 2025-12-03 14:43:48.518 187287 INFO nova.compute.manager [-] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:43:48 np0005544118 nova_compute[187283]: 2025-12-03 14:43:48.540 187287 DEBUG nova.compute.manager [None req-6d36b32c-68fd-4db2-a1ac-5d2b8cd41130 - - - - - -] [instance: c5ae593f-cc24-468c-a57d-0e9e8db00913] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:43:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:43:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:43:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:43:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:43:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:43:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:43:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:43:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:43:50 np0005544118 nova_compute[187283]: 2025-12-03 14:43:50.297 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:51 np0005544118 podman[218185]: 2025-12-03 14:43:51.823176193 +0000 UTC m=+0.052345626 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:43:52 np0005544118 nova_compute[187283]: 2025-12-03 14:43:52.306 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764773017.3053417, 10be4230-b79b-4485-8ff7-2d1fba98d5ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:43:52 np0005544118 nova_compute[187283]: 2025-12-03 14:43:52.306 187287 INFO nova.compute.manager [-] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:43:52 np0005544118 nova_compute[187283]: 2025-12-03 14:43:52.332 187287 DEBUG nova.compute.manager [None req-824b33a1-dd2f-4f29-98d8-b9d156f2622b - - - - - -] [instance: 10be4230-b79b-4485-8ff7-2d1fba98d5ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:43:52 np0005544118 nova_compute[187283]: 2025-12-03 14:43:52.334 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:55 np0005544118 nova_compute[187283]: 2025-12-03 14:43:55.299 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:43:55 np0005544118 podman[218206]: 2025-12-03 14:43:55.832340683 +0000 UTC m=+0.057588897 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:43:57 np0005544118 nova_compute[187283]: 2025-12-03 14:43:57.336 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:00 np0005544118 nova_compute[187283]: 2025-12-03 14:44:00.302 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:44:00.982 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:44:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:44:00.983 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:44:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:44:00.983 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:44:02 np0005544118 nova_compute[187283]: 2025-12-03 14:44:02.337 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:03 np0005544118 podman[218226]: 2025-12-03 14:44:03.817251115 +0000 UTC m=+0.050280368 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec  3 09:44:05 np0005544118 nova_compute[187283]: 2025-12-03 14:44:05.304 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:05 np0005544118 podman[197639]: time="2025-12-03T14:44:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:44:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:44:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:44:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:44:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Dec  3 09:44:06 np0005544118 podman[218247]: 2025-12-03 14:44:06.807126823 +0000 UTC m=+0.045103488 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:44:07 np0005544118 nova_compute[187283]: 2025-12-03 14:44:07.339 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:08 np0005544118 ovn_controller[95637]: 2025-12-03T14:44:08Z|00222|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Dec  3 09:44:10 np0005544118 nova_compute[187283]: 2025-12-03 14:44:10.308 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:12 np0005544118 nova_compute[187283]: 2025-12-03 14:44:12.342 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:13 np0005544118 nova_compute[187283]: 2025-12-03 14:44:13.782 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:13 np0005544118 podman[218272]: 2025-12-03 14:44:13.860696245 +0000 UTC m=+0.080289524 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:44:15 np0005544118 nova_compute[187283]: 2025-12-03 14:44:15.309 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:17 np0005544118 nova_compute[187283]: 2025-12-03 14:44:17.343 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:44:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:44:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:44:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:44:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:44:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:44:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:44:20 np0005544118 nova_compute[187283]: 2025-12-03 14:44:20.312 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:22 np0005544118 nova_compute[187283]: 2025-12-03 14:44:22.346 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:22 np0005544118 nova_compute[187283]: 2025-12-03 14:44:22.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:22 np0005544118 podman[218299]: 2025-12-03 14:44:22.818359476 +0000 UTC m=+0.052359126 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Dec  3 09:44:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:44:22.968 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:44:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:44:22.969 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:44:22 np0005544118 nova_compute[187283]: 2025-12-03 14:44:22.969 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:24 np0005544118 nova_compute[187283]: 2025-12-03 14:44:24.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:25 np0005544118 nova_compute[187283]: 2025-12-03 14:44:25.313 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:26 np0005544118 podman[218321]: 2025-12-03 14:44:26.836390748 +0000 UTC m=+0.068472493 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:44:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:44:26.971 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:44:27 np0005544118 nova_compute[187283]: 2025-12-03 14:44:27.393 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:28 np0005544118 nova_compute[187283]: 2025-12-03 14:44:28.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:29 np0005544118 nova_compute[187283]: 2025-12-03 14:44:29.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:29 np0005544118 nova_compute[187283]: 2025-12-03 14:44:29.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:30 np0005544118 nova_compute[187283]: 2025-12-03 14:44:30.320 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:32 np0005544118 nova_compute[187283]: 2025-12-03 14:44:32.394 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.646 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.646 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.673 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.674 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.674 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.674 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.824 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.825 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5886MB free_disk=73.33393859863281GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.826 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.826 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.908 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.908 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.933 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.972 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.972 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:44:33 np0005544118 nova_compute[187283]: 2025-12-03 14:44:33.995 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:44:34 np0005544118 nova_compute[187283]: 2025-12-03 14:44:34.017 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:44:34 np0005544118 nova_compute[187283]: 2025-12-03 14:44:34.037 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:44:34 np0005544118 nova_compute[187283]: 2025-12-03 14:44:34.054 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:44:34 np0005544118 nova_compute[187283]: 2025-12-03 14:44:34.085 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:44:34 np0005544118 nova_compute[187283]: 2025-12-03 14:44:34.086 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:44:34 np0005544118 podman[218343]: 2025-12-03 14:44:34.811367351 +0000 UTC m=+0.044361657 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:44:35 np0005544118 nova_compute[187283]: 2025-12-03 14:44:35.449 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:35 np0005544118 podman[197639]: time="2025-12-03T14:44:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:44:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:44:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:44:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:44:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2599 "" "Go-http-client/1.1"
Dec  3 09:44:37 np0005544118 nova_compute[187283]: 2025-12-03 14:44:37.396 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:37 np0005544118 podman[218362]: 2025-12-03 14:44:37.823358201 +0000 UTC m=+0.051141673 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:44:38 np0005544118 nova_compute[187283]: 2025-12-03 14:44:38.081 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:40 np0005544118 nova_compute[187283]: 2025-12-03 14:44:40.368 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:41 np0005544118 nova_compute[187283]: 2025-12-03 14:44:41.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:44:41 np0005544118 nova_compute[187283]: 2025-12-03 14:44:41.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:44:42 np0005544118 nova_compute[187283]: 2025-12-03 14:44:42.397 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:44 np0005544118 podman[218386]: 2025-12-03 14:44:44.85365786 +0000 UTC m=+0.087881471 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:44:45 np0005544118 nova_compute[187283]: 2025-12-03 14:44:45.369 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:47 np0005544118 nova_compute[187283]: 2025-12-03 14:44:47.398 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:44:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:44:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:44:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:44:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:44:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:44:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:44:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:44:50 np0005544118 nova_compute[187283]: 2025-12-03 14:44:50.369 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:52 np0005544118 nova_compute[187283]: 2025-12-03 14:44:52.400 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:53 np0005544118 ovn_controller[95637]: 2025-12-03T14:44:53Z|00223|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  3 09:44:53 np0005544118 podman[218413]: 2025-12-03 14:44:53.835450685 +0000 UTC m=+0.064300119 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter)
Dec  3 09:44:55 np0005544118 nova_compute[187283]: 2025-12-03 14:44:55.380 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:57 np0005544118 nova_compute[187283]: 2025-12-03 14:44:57.401 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:44:57 np0005544118 podman[218434]: 2025-12-03 14:44:57.858871225 +0000 UTC m=+0.082764382 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:45:00 np0005544118 nova_compute[187283]: 2025-12-03 14:45:00.431 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:00.984 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:00.985 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:00.986 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:02 np0005544118 nova_compute[187283]: 2025-12-03 14:45:02.402 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.066 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.067 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.084 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.152 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.152 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.158 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.158 187287 INFO nova.compute.claims [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.255 187287 DEBUG nova.compute.provider_tree [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.269 187287 DEBUG nova.scheduler.client.report [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.286 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.287 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.328 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.328 187287 DEBUG nova.network.neutron [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:45:05 np0005544118 podman[218456]: 2025-12-03 14:45:05.330380335 +0000 UTC m=+0.044872257 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.344 187287 INFO nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.369 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.433 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.460 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.462 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.462 187287 INFO nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Creating image(s)#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.463 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "/var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.463 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "/var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.464 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "/var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.476 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.495 187287 DEBUG nova.policy [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef71ac78a3c14698845fdb4e5991acf4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '31c06166ed7946108a60a70c4f424899', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.539 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.540 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.541 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.553 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.614 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.615 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:05 np0005544118 podman[197639]: time="2025-12-03T14:45:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:45:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:45:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:45:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:45:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.648 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.649 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.650 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.704 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.705 187287 DEBUG nova.virt.disk.api [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Checking if we can resize image /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.706 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.760 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.761 187287 DEBUG nova.virt.disk.api [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Cannot resize image /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.762 187287 DEBUG nova.objects.instance [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lazy-loading 'migration_context' on Instance uuid 1aa3b883-c6d3-427a-981b-001724e618c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.785 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.785 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Ensure instance console log exists: /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.786 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.786 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:05 np0005544118 nova_compute[187283]: 2025-12-03 14:45:05.786 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:07 np0005544118 nova_compute[187283]: 2025-12-03 14:45:07.402 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:08 np0005544118 nova_compute[187283]: 2025-12-03 14:45:08.696 187287 DEBUG nova.network.neutron [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Successfully created port: 86428e20-27e4-4c02-9dfc-e15d45e5c8cf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:45:08 np0005544118 podman[218490]: 2025-12-03 14:45:08.814640358 +0000 UTC m=+0.051826547 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.434 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.798 187287 DEBUG nova.network.neutron [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Successfully updated port: 86428e20-27e4-4c02-9dfc-e15d45e5c8cf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.814 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.814 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquired lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.814 187287 DEBUG nova.network.neutron [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.890 187287 DEBUG nova.compute.manager [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received event network-changed-86428e20-27e4-4c02-9dfc-e15d45e5c8cf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.890 187287 DEBUG nova.compute.manager [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Refreshing instance network info cache due to event network-changed-86428e20-27e4-4c02-9dfc-e15d45e5c8cf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.891 187287 DEBUG oslo_concurrency.lockutils [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:45:10 np0005544118 nova_compute[187283]: 2025-12-03 14:45:10.947 187287 DEBUG nova.network.neutron [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.403 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.848 187287 DEBUG nova.network.neutron [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Updating instance_info_cache with network_info: [{"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.873 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Releasing lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.874 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Instance network_info: |[{"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.874 187287 DEBUG oslo_concurrency.lockutils [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.875 187287 DEBUG nova.network.neutron [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Refreshing network info cache for port 86428e20-27e4-4c02-9dfc-e15d45e5c8cf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.877 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Start _get_guest_xml network_info=[{"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.882 187287 WARNING nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.888 187287 DEBUG nova.virt.libvirt.host [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.889 187287 DEBUG nova.virt.libvirt.host [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.893 187287 DEBUG nova.virt.libvirt.host [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.894 187287 DEBUG nova.virt.libvirt.host [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.895 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.896 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.896 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.896 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.896 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.897 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.897 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.897 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.897 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.897 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.898 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.898 187287 DEBUG nova.virt.hardware [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.902 187287 DEBUG nova.virt.libvirt.vif [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:45:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-787643382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-787643382',id=26,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31c06166ed7946108a60a70c4f424899',ramdisk_id='',reservation_id='r-2hov40xh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:45:05Z,user_data=None,user_id='ef71ac78a3c14698845fdb4e5991acf4',uuid=1aa3b883-c6d3-427a-981b-001724e618c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.902 187287 DEBUG nova.network.os_vif_util [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converting VIF {"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.903 187287 DEBUG nova.network.os_vif_util [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:86:97,bridge_name='br-int',has_traffic_filtering=True,id=86428e20-27e4-4c02-9dfc-e15d45e5c8cf,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86428e20-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.904 187287 DEBUG nova.objects.instance [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1aa3b883-c6d3-427a-981b-001724e618c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.923 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <uuid>1aa3b883-c6d3-427a-981b-001724e618c9</uuid>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <name>instance-0000001a</name>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-787643382</nova:name>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:45:12</nova:creationTime>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:user uuid="ef71ac78a3c14698845fdb4e5991acf4">tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755-project-member</nova:user>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:project uuid="31c06166ed7946108a60a70c4f424899">tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755</nova:project>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        <nova:port uuid="86428e20-27e4-4c02-9dfc-e15d45e5c8cf">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <entry name="serial">1aa3b883-c6d3-427a-981b-001724e618c9</entry>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <entry name="uuid">1aa3b883-c6d3-427a-981b-001724e618c9</entry>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk.config"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:84:86:97"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <target dev="tap86428e20-27"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/console.log" append="off"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:45:12 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:45:12 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:45:12 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:45:12 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.924 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Preparing to wait for external event network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.924 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.924 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.925 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.925 187287 DEBUG nova.virt.libvirt.vif [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:45:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-787643382',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-787643382',id=26,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='31c06166ed7946108a60a70c4f424899',ramdisk_id='',reservation_id='r-2hov40xh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755',owner_user_name='tempest-TestExecute
VmWorkloadBalanceStrategy-1741413755-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:45:05Z,user_data=None,user_id='ef71ac78a3c14698845fdb4e5991acf4',uuid=1aa3b883-c6d3-427a-981b-001724e618c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.926 187287 DEBUG nova.network.os_vif_util [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converting VIF {"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.926 187287 DEBUG nova.network.os_vif_util [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:86:97,bridge_name='br-int',has_traffic_filtering=True,id=86428e20-27e4-4c02-9dfc-e15d45e5c8cf,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86428e20-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.926 187287 DEBUG os_vif [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:86:97,bridge_name='br-int',has_traffic_filtering=True,id=86428e20-27e4-4c02-9dfc-e15d45e5c8cf,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86428e20-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.927 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.927 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.928 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.932 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.932 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap86428e20-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.933 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap86428e20-27, col_values=(('external_ids', {'iface-id': '86428e20-27e4-4c02-9dfc-e15d45e5c8cf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:86:97', 'vm-uuid': '1aa3b883-c6d3-427a-981b-001724e618c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.934 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.936 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:45:12 np0005544118 NetworkManager[55710]: <info>  [1764773112.9370] manager: (tap86428e20-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.941 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.942 187287 INFO os_vif [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:86:97,bridge_name='br-int',has_traffic_filtering=True,id=86428e20-27e4-4c02-9dfc-e15d45e5c8cf,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86428e20-27')#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.996 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.997 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.997 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] No VIF found with MAC fa:16:3e:84:86:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:45:12 np0005544118 nova_compute[187283]: 2025-12-03 14:45:12.998 187287 INFO nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Using config drive#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.369 187287 INFO nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Creating config drive at /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk.config#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.375 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt4m5tbbp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.515 187287 DEBUG oslo_concurrency.processutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpt4m5tbbp" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:13 np0005544118 kernel: tap86428e20-27: entered promiscuous mode
Dec  3 09:45:13 np0005544118 NetworkManager[55710]: <info>  [1764773113.5801] manager: (tap86428e20-27): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.580 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:13Z|00224|binding|INFO|Claiming lport 86428e20-27e4-4c02-9dfc-e15d45e5c8cf for this chassis.
Dec  3 09:45:13 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:13Z|00225|binding|INFO|86428e20-27e4-4c02-9dfc-e15d45e5c8cf: Claiming fa:16:3e:84:86:97 10.100.0.4
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.585 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.589 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.602 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:86:97 10.100.0.4'], port_security=['fa:16:3e:84:86:97 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1aa3b883-c6d3-427a-981b-001724e618c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24de5d6e-2ac8-426f-9829-e0345484f333', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31c06166ed7946108a60a70c4f424899', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a86cd676-17cb-4aeb-a389-e4ff346ec635', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a997ea5-9f15-4402-8360-f2198fef78d4, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=86428e20-27e4-4c02-9dfc-e15d45e5c8cf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.603 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 86428e20-27e4-4c02-9dfc-e15d45e5c8cf in datapath 24de5d6e-2ac8-426f-9829-e0345484f333 bound to our chassis#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.604 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24de5d6e-2ac8-426f-9829-e0345484f333#033[00m
Dec  3 09:45:13 np0005544118 systemd-udevd[218529]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.615 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b295adfe-caef-4ce6-a18a-dd244a6aea21]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.616 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap24de5d6e-21 in ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:45:13 np0005544118 systemd-machined[153602]: New machine qemu-21-instance-0000001a.
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.619 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap24de5d6e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.619 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4fb3d370-1600-4eab-ba25-1ae0ec99b44f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 NetworkManager[55710]: <info>  [1764773113.6211] device (tap86428e20-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.621 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9fa96a3c-ccb4-4e08-8717-76043febc41e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 NetworkManager[55710]: <info>  [1764773113.6221] device (tap86428e20-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.635 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[7aaed664-2437-4cab-814e-d055c3f19c2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.653 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[287d80b4-b7de-498a-8f8f-36c2ad3b98ac]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 systemd[1]: Started Virtual Machine qemu-21-instance-0000001a.
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.656 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:13Z|00226|binding|INFO|Setting lport 86428e20-27e4-4c02-9dfc-e15d45e5c8cf ovn-installed in OVS
Dec  3 09:45:13 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:13Z|00227|binding|INFO|Setting lport 86428e20-27e4-4c02-9dfc-e15d45e5c8cf up in Southbound
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.662 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.680 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[ea06ce2b-6225-4d68-ae06-b0ed56cdeacf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.686 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7316e1ca-72c2-4154-b9e7-282eef1c86dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 NetworkManager[55710]: <info>  [1764773113.6869] manager: (tap24de5d6e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.716 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[196dd2e4-5648-46ad-84de-385cab524257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.720 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[423b0972-43a9-403e-bb6f-0e7bace88d6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 NetworkManager[55710]: <info>  [1764773113.7427] device (tap24de5d6e-20): carrier: link connected
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.746 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[0c10c166-342a-4b84-a7ab-b5d0d2913ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.763 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[59b2a088-9f12-4198-ac24-55e200de6982]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24de5d6e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:56:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526935, 'reachable_time': 19375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218563, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.778 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f8006970-6869-4118-b573-5c36b8dd6652]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef5:56cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526935, 'tstamp': 526935}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218564, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.795 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c213f8-b9c6-4073-b002-f6288538d54e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24de5d6e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:56:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526935, 'reachable_time': 19375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218565, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.828 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[eca5b7da-a67a-4cd1-a999-0a0debe355ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.884 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[283bd69b-65bb-4301-a836-de3dcc19a6c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.886 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24de5d6e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.886 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.886 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24de5d6e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:45:13 np0005544118 kernel: tap24de5d6e-20: entered promiscuous mode
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.888 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 NetworkManager[55710]: <info>  [1764773113.8887] manager: (tap24de5d6e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.890 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.890 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24de5d6e-20, col_values=(('external_ids', {'iface-id': '46a398a0-ef17-4439-b26e-87c3dd2c7c58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:45:13 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:13Z|00228|binding|INFO|Releasing lport 46a398a0-ef17-4439-b26e-87c3dd2c7c58 from this chassis (sb_readonly=0)
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.892 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.902 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.903 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/24de5d6e-2ac8-426f-9829-e0345484f333.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/24de5d6e-2ac8-426f-9829-e0345484f333.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.903 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0a9556a2-a477-4ecd-8b94-3b3a96ff93e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.904 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-24de5d6e-2ac8-426f-9829-e0345484f333
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/24de5d6e-2ac8-426f-9829-e0345484f333.pid.haproxy
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 24de5d6e-2ac8-426f-9829-e0345484f333
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:45:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:45:13.905 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'env', 'PROCESS_TAG=haproxy-24de5d6e-2ac8-426f-9829-e0345484f333', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/24de5d6e-2ac8-426f-9829-e0345484f333.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.944 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773113.944098, 1aa3b883-c6d3-427a-981b-001724e618c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.945 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] VM Started (Lifecycle Event)#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.984 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.988 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773113.945115, 1aa3b883-c6d3-427a-981b-001724e618c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.988 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.991 187287 DEBUG nova.compute.manager [req-5b804e54-12f2-4dd7-a0b4-87542a411d84 req-e76fbc51-2463-4556-a1a1-307b68a2ab9b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received event network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.992 187287 DEBUG oslo_concurrency.lockutils [req-5b804e54-12f2-4dd7-a0b4-87542a411d84 req-e76fbc51-2463-4556-a1a1-307b68a2ab9b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.992 187287 DEBUG oslo_concurrency.lockutils [req-5b804e54-12f2-4dd7-a0b4-87542a411d84 req-e76fbc51-2463-4556-a1a1-307b68a2ab9b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.993 187287 DEBUG oslo_concurrency.lockutils [req-5b804e54-12f2-4dd7-a0b4-87542a411d84 req-e76fbc51-2463-4556-a1a1-307b68a2ab9b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.993 187287 DEBUG nova.compute.manager [req-5b804e54-12f2-4dd7-a0b4-87542a411d84 req-e76fbc51-2463-4556-a1a1-307b68a2ab9b c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Processing event network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.994 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:45:13 np0005544118 nova_compute[187283]: 2025-12-03 14:45:13.997 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.001 187287 INFO nova.virt.libvirt.driver [-] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Instance spawned successfully.#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.002 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.014 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.017 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773113.9971485, 1aa3b883-c6d3-427a-981b-001724e618c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.017 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.024 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.025 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.025 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.025 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.026 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.026 187287 DEBUG nova.virt.libvirt.driver [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.034 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.037 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.067 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.094 187287 INFO nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Took 8.63 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.095 187287 DEBUG nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.147 187287 INFO nova.compute.manager [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Took 9.02 seconds to build instance.#033[00m
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.174 187287 DEBUG oslo_concurrency.lockutils [None req-b7f9336b-c122-4d7c-a5bc-a890cbdd4386 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:14 np0005544118 podman[218605]: 2025-12-03 14:45:14.296695603 +0000 UTC m=+0.062586590 container create d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:45:14 np0005544118 systemd[1]: Started libpod-conmon-d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda.scope.
Dec  3 09:45:14 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:45:14 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53c0fcb0c9e88c4c67f765881d9878fcd292a2beb0d657ec0bc31a559feea667/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:45:14 np0005544118 podman[218605]: 2025-12-03 14:45:14.263512007 +0000 UTC m=+0.029403044 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.376 187287 DEBUG nova.network.neutron [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Updated VIF entry in instance network info cache for port 86428e20-27e4-4c02-9dfc-e15d45e5c8cf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:45:14 np0005544118 podman[218605]: 2025-12-03 14:45:14.377380108 +0000 UTC m=+0.143271095 container init d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.377 187287 DEBUG nova.network.neutron [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Updating instance_info_cache with network_info: [{"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:45:14 np0005544118 podman[218605]: 2025-12-03 14:45:14.382301722 +0000 UTC m=+0.148192719 container start d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:45:14 np0005544118 nova_compute[187283]: 2025-12-03 14:45:14.391 187287 DEBUG oslo_concurrency.lockutils [req-35349be9-565c-4062-9637-f1ded480b855 req-bf2e1378-f856-41fc-9bb5-74b832daa65e c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:45:14 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [NOTICE]   (218624) : New worker (218626) forked
Dec  3 09:45:14 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [NOTICE]   (218624) : Loading success.
Dec  3 09:45:15 np0005544118 nova_compute[187283]: 2025-12-03 14:45:15.436 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:15 np0005544118 podman[218635]: 2025-12-03 14:45:15.889780777 +0000 UTC m=+0.111777606 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true)
Dec  3 09:45:16 np0005544118 nova_compute[187283]: 2025-12-03 14:45:16.072 187287 DEBUG nova.compute.manager [req-8765023c-61bc-4581-b3b4-7e9b74def395 req-b44fd52e-bbe0-405f-a586-d38eb670a64f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received event network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:45:16 np0005544118 nova_compute[187283]: 2025-12-03 14:45:16.072 187287 DEBUG oslo_concurrency.lockutils [req-8765023c-61bc-4581-b3b4-7e9b74def395 req-b44fd52e-bbe0-405f-a586-d38eb670a64f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:16 np0005544118 nova_compute[187283]: 2025-12-03 14:45:16.073 187287 DEBUG oslo_concurrency.lockutils [req-8765023c-61bc-4581-b3b4-7e9b74def395 req-b44fd52e-bbe0-405f-a586-d38eb670a64f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:16 np0005544118 nova_compute[187283]: 2025-12-03 14:45:16.073 187287 DEBUG oslo_concurrency.lockutils [req-8765023c-61bc-4581-b3b4-7e9b74def395 req-b44fd52e-bbe0-405f-a586-d38eb670a64f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:16 np0005544118 nova_compute[187283]: 2025-12-03 14:45:16.073 187287 DEBUG nova.compute.manager [req-8765023c-61bc-4581-b3b4-7e9b74def395 req-b44fd52e-bbe0-405f-a586-d38eb670a64f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] No waiting events found dispatching network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:45:16 np0005544118 nova_compute[187283]: 2025-12-03 14:45:16.073 187287 WARNING nova.compute.manager [req-8765023c-61bc-4581-b3b4-7e9b74def395 req-b44fd52e-bbe0-405f-a586-d38eb670a64f c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received unexpected event network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf for instance with vm_state active and task_state None.#033[00m
Dec  3 09:45:17 np0005544118 nova_compute[187283]: 2025-12-03 14:45:17.979 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:45:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:45:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:45:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:45:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:45:20 np0005544118 nova_compute[187283]: 2025-12-03 14:45:20.490 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:23 np0005544118 nova_compute[187283]: 2025-12-03 14:45:23.022 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:23 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:45:24 np0005544118 nova_compute[187283]: 2025-12-03 14:45:24.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:24 np0005544118 podman[218664]: 2025-12-03 14:45:24.823285548 +0000 UTC m=+0.056249189 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git)
Dec  3 09:45:25 np0005544118 nova_compute[187283]: 2025-12-03 14:45:25.493 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:26 np0005544118 nova_compute[187283]: 2025-12-03 14:45:26.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:26Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:84:86:97 10.100.0.4
Dec  3 09:45:26 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:26Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:84:86:97 10.100.0.4
Dec  3 09:45:28 np0005544118 nova_compute[187283]: 2025-12-03 14:45:28.082 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:28 np0005544118 podman[218702]: 2025-12-03 14:45:28.81328619 +0000 UTC m=+0.050381998 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:45:29 np0005544118 nova_compute[187283]: 2025-12-03 14:45:29.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:30 np0005544118 nova_compute[187283]: 2025-12-03 14:45:30.495 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:30 np0005544118 nova_compute[187283]: 2025-12-03 14:45:30.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:30 np0005544118 nova_compute[187283]: 2025-12-03 14:45:30.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:33 np0005544118 nova_compute[187283]: 2025-12-03 14:45:33.085 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.496 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:45:35 np0005544118 podman[197639]: time="2025-12-03T14:45:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:45:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:45:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  3 09:45:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:45:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Dec  3 09:45:35 np0005544118 podman[218724]: 2025-12-03 14:45:35.816548573 +0000 UTC m=+0.053069461 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.844 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.844 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.844 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:45:35 np0005544118 nova_compute[187283]: 2025-12-03 14:45:35.844 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1aa3b883-c6d3-427a-981b-001724e618c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.087 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.647 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Updating instance_info_cache with network_info: [{"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.671 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-1aa3b883-c6d3-427a-981b-001724e618c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.671 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.672 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.694 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.695 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.696 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.696 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.785 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.851 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.852 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:45:38 np0005544118 nova_compute[187283]: 2025-12-03 14:45:38.917 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.060 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.061 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5689MB free_disk=73.3050651550293GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.061 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.062 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.145 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 1aa3b883-c6d3-427a-981b-001724e618c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.145 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.145 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.229 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.243 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.270 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:45:39 np0005544118 nova_compute[187283]: 2025-12-03 14:45:39.270 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:45:39 np0005544118 podman[218750]: 2025-12-03 14:45:39.820683341 +0000 UTC m=+0.052516156 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:45:40 np0005544118 nova_compute[187283]: 2025-12-03 14:45:40.499 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:42 np0005544118 nova_compute[187283]: 2025-12-03 14:45:42.207 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:42 np0005544118 nova_compute[187283]: 2025-12-03 14:45:42.207 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:42 np0005544118 nova_compute[187283]: 2025-12-03 14:45:42.236 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:45:42 np0005544118 nova_compute[187283]: 2025-12-03 14:45:42.237 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:45:43 np0005544118 nova_compute[187283]: 2025-12-03 14:45:43.089 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:43 np0005544118 ovn_controller[95637]: 2025-12-03T14:45:43Z|00229|memory_trim|INFO|Detected inactivity (last active 30025 ms ago): trimming memory
Dec  3 09:45:45 np0005544118 nova_compute[187283]: 2025-12-03 14:45:45.501 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:46 np0005544118 podman[218774]: 2025-12-03 14:45:46.883940845 +0000 UTC m=+0.111860768 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:45:48 np0005544118 nova_compute[187283]: 2025-12-03 14:45:48.092 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:45:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:45:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:45:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:45:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:45:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:45:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:45:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:45:50 np0005544118 nova_compute[187283]: 2025-12-03 14:45:50.502 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:53 np0005544118 nova_compute[187283]: 2025-12-03 14:45:53.095 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:55 np0005544118 nova_compute[187283]: 2025-12-03 14:45:55.503 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:55 np0005544118 podman[218803]: 2025-12-03 14:45:55.817553437 +0000 UTC m=+0.048721783 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec  3 09:45:58 np0005544118 nova_compute[187283]: 2025-12-03 14:45:58.097 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:45:59 np0005544118 podman[218825]: 2025-12-03 14:45:59.844453646 +0000 UTC m=+0.080536941 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  3 09:46:00 np0005544118 nova_compute[187283]: 2025-12-03 14:46:00.532 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:00.985 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:00.986 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:00.987 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:03 np0005544118 nova_compute[187283]: 2025-12-03 14:46:03.101 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:05 np0005544118 nova_compute[187283]: 2025-12-03 14:46:05.533 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:05 np0005544118 podman[197639]: time="2025-12-03T14:46:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:46:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:46:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  3 09:46:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:46:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3070 "" "Go-http-client/1.1"
Dec  3 09:46:06 np0005544118 podman[218845]: 2025-12-03 14:46:06.810345949 +0000 UTC m=+0.046681597 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  3 09:46:08 np0005544118 nova_compute[187283]: 2025-12-03 14:46:08.103 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:10 np0005544118 nova_compute[187283]: 2025-12-03 14:46:10.577 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:10 np0005544118 podman[218862]: 2025-12-03 14:46:10.809280245 +0000 UTC m=+0.043532131 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:46:13 np0005544118 nova_compute[187283]: 2025-12-03 14:46:13.106 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:15 np0005544118 nova_compute[187283]: 2025-12-03 14:46:15.620 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:17 np0005544118 podman[218886]: 2025-12-03 14:46:17.951140085 +0000 UTC m=+0.176275268 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  3 09:46:18 np0005544118 nova_compute[187283]: 2025-12-03 14:46:18.108 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:19 np0005544118 nova_compute[187283]: 2025-12-03 14:46:19.027 187287 DEBUG nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Creating tmpfile /var/lib/nova/instances/tmpf5zx_izy to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:46:19 np0005544118 nova_compute[187283]: 2025-12-03 14:46:19.135 187287 DEBUG nova.compute.manager [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf5zx_izy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:46:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:46:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:46:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:46:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:46:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:46:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:46:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:46:20 np0005544118 nova_compute[187283]: 2025-12-03 14:46:20.349 187287 DEBUG nova.compute.manager [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf5zx_izy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='66c14c84-6e65-420e-8460-c57e94c10ad1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:46:20 np0005544118 nova_compute[187283]: 2025-12-03 14:46:20.388 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:46:20 np0005544118 nova_compute[187283]: 2025-12-03 14:46:20.389 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:46:20 np0005544118 nova_compute[187283]: 2025-12-03 14:46:20.389 187287 DEBUG nova.network.neutron [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:46:20 np0005544118 nova_compute[187283]: 2025-12-03 14:46:20.622 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.496 187287 DEBUG nova.network.neutron [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Updating instance_info_cache with network_info: [{"id": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "address": "fa:16:3e:82:5a:70", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc6959a-c1", "ovs_interfaceid": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.518 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.521 187287 DEBUG nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf5zx_izy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='66c14c84-6e65-420e-8460-c57e94c10ad1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.522 187287 DEBUG nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Creating instance directory: /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.522 187287 DEBUG nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Creating disk.info with the contents: {'/var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk': 'qcow2', '/var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.523 187287 DEBUG nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.524 187287 DEBUG nova.objects.instance [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 66c14c84-6e65-420e-8460-c57e94c10ad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.564 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.623 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.624 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.624 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.639 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.732 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.733 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.763 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.764 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.765 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.824 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.826 187287 DEBUG nova.virt.disk.api [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.827 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.894 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.896 187287 DEBUG nova.virt.disk.api [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.896 187287 DEBUG nova.objects.instance [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 66c14c84-6e65-420e-8460-c57e94c10ad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.919 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.944 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk.config 485376" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.947 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk.config to /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:46:21 np0005544118 nova_compute[187283]: 2025-12-03 14:46:21.948 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk.config /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.549 187287 DEBUG oslo_concurrency.processutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk.config /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1" returned: 0 in 0.602s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.552 187287 DEBUG nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.554 187287 DEBUG nova.virt.libvirt.vif [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:44:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-594785372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-594785372',id=25,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:44:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='31c06166ed7946108a60a70c4f424899',ramdisk_id='',reservation_id='r-9l5i3bkm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:44:59Z,user_data=None,user_id='ef71ac78a3c14698845fdb4e5991acf4',uuid=66c14c84-6e65-420e-8460-c57e94c10ad1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "address": "fa:16:3e:82:5a:70", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3dc6959a-c1", "ovs_interfaceid": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.555 187287 DEBUG nova.network.os_vif_util [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "address": "fa:16:3e:82:5a:70", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3dc6959a-c1", "ovs_interfaceid": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.558 187287 DEBUG nova.network.os_vif_util [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc6959a-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.559 187287 DEBUG os_vif [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc6959a-c1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.560 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.561 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.563 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.566 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.567 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3dc6959a-c1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.567 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3dc6959a-c1, col_values=(('external_ids', {'iface-id': '3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:82:5a:70', 'vm-uuid': '66c14c84-6e65-420e-8460-c57e94c10ad1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.569 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:22 np0005544118 NetworkManager[55710]: <info>  [1764773182.5709] manager: (tap3dc6959a-c1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.571 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.580 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.581 187287 INFO os_vif [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc6959a-c1')#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.582 187287 DEBUG nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:46:22 np0005544118 nova_compute[187283]: 2025-12-03 14:46:22.582 187287 DEBUG nova.compute.manager [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf5zx_izy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='66c14c84-6e65-420e-8460-c57e94c10ad1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:46:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:23.292 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:46:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:23.294 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:46:23 np0005544118 nova_compute[187283]: 2025-12-03 14:46:23.321 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:23 np0005544118 nova_compute[187283]: 2025-12-03 14:46:23.709 187287 DEBUG nova.network.neutron [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Port 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:46:23 np0005544118 nova_compute[187283]: 2025-12-03 14:46:23.711 187287 DEBUG nova.compute.manager [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpf5zx_izy',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='66c14c84-6e65-420e-8460-c57e94c10ad1',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:46:23 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:46:23 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:46:24 np0005544118 kernel: tap3dc6959a-c1: entered promiscuous mode
Dec  3 09:46:24 np0005544118 NetworkManager[55710]: <info>  [1764773184.0519] manager: (tap3dc6959a-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Dec  3 09:46:24 np0005544118 nova_compute[187283]: 2025-12-03 14:46:24.053 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:24 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:24Z|00230|binding|INFO|Claiming lport 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a for this additional chassis.
Dec  3 09:46:24 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:24Z|00231|binding|INFO|3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a: Claiming fa:16:3e:82:5a:70 10.100.0.10
Dec  3 09:46:24 np0005544118 nova_compute[187283]: 2025-12-03 14:46:24.057 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:24 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:24Z|00232|binding|INFO|Setting lport 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a ovn-installed in OVS
Dec  3 09:46:24 np0005544118 nova_compute[187283]: 2025-12-03 14:46:24.070 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:24 np0005544118 nova_compute[187283]: 2025-12-03 14:46:24.070 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:24 np0005544118 systemd-udevd[218966]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:46:24 np0005544118 systemd-machined[153602]: New machine qemu-22-instance-00000019.
Dec  3 09:46:24 np0005544118 NetworkManager[55710]: <info>  [1764773184.1069] device (tap3dc6959a-c1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:46:24 np0005544118 NetworkManager[55710]: <info>  [1764773184.1079] device (tap3dc6959a-c1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:46:24 np0005544118 systemd[1]: Started Virtual Machine qemu-22-instance-00000019.
Dec  3 09:46:25 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:25.297 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:25 np0005544118 nova_compute[187283]: 2025-12-03 14:46:25.479 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773185.4790416, 66c14c84-6e65-420e-8460-c57e94c10ad1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:46:25 np0005544118 nova_compute[187283]: 2025-12-03 14:46:25.480 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] VM Started (Lifecycle Event)#033[00m
Dec  3 09:46:25 np0005544118 nova_compute[187283]: 2025-12-03 14:46:25.502 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:46:25 np0005544118 nova_compute[187283]: 2025-12-03 14:46:25.664 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:25 np0005544118 nova_compute[187283]: 2025-12-03 14:46:25.734 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:26 np0005544118 nova_compute[187283]: 2025-12-03 14:46:26.404 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773186.403954, 66c14c84-6e65-420e-8460-c57e94c10ad1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:46:26 np0005544118 nova_compute[187283]: 2025-12-03 14:46:26.404 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:46:26 np0005544118 nova_compute[187283]: 2025-12-03 14:46:26.429 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:46:26 np0005544118 nova_compute[187283]: 2025-12-03 14:46:26.432 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:46:26 np0005544118 nova_compute[187283]: 2025-12-03 14:46:26.456 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:46:26 np0005544118 podman[218991]: 2025-12-03 14:46:26.843425569 +0000 UTC m=+0.065457609 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.buildah.version=1.33.7)
Dec  3 09:46:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:27Z|00233|binding|INFO|Claiming lport 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a for this chassis.
Dec  3 09:46:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:27Z|00234|binding|INFO|3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a: Claiming fa:16:3e:82:5a:70 10.100.0.10
Dec  3 09:46:27 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:27Z|00235|binding|INFO|Setting lport 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a up in Southbound
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.471 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:5a:70 10.100.0.10'], port_security=['fa:16:3e:82:5a:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '66c14c84-6e65-420e-8460-c57e94c10ad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24de5d6e-2ac8-426f-9829-e0345484f333', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31c06166ed7946108a60a70c4f424899', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'a86cd676-17cb-4aeb-a389-e4ff346ec635', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a997ea5-9f15-4402-8360-f2198fef78d4, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.473 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a in datapath 24de5d6e-2ac8-426f-9829-e0345484f333 bound to our chassis#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.476 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24de5d6e-2ac8-426f-9829-e0345484f333#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.489 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[06771fb3-3f78-4ef1-b842-85fd331cd0d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.521 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[89cb824f-ee94-4be5-9cfe-45a8ebf782cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.524 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[34a72e45-0a9b-44a7-ab93-0a7eb83b8812]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.553 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[003777e1-7b71-4832-9764-bec6698145a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:27 np0005544118 nova_compute[187283]: 2025-12-03 14:46:27.569 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.570 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e3fc9dbf-d27c-42db-a679-fae6a4ea4252]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24de5d6e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:56:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 5, 'rx_bytes': 826, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526935, 'reachable_time': 19375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219018, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.584 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[dd2197b7-d0f5-4dc2-b62e-3ad94d8cf7bc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24de5d6e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526946, 'tstamp': 526946}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219019, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24de5d6e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526948, 'tstamp': 526948}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219019, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.586 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24de5d6e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:27 np0005544118 nova_compute[187283]: 2025-12-03 14:46:27.587 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.588 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24de5d6e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.589 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.589 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24de5d6e-20, col_values=(('external_ids', {'iface-id': '46a398a0-ef17-4439-b26e-87c3dd2c7c58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:27 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:27.589 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:46:27 np0005544118 nova_compute[187283]: 2025-12-03 14:46:27.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:27 np0005544118 nova_compute[187283]: 2025-12-03 14:46:27.646 187287 INFO nova.compute.manager [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Post operation of migration started#033[00m
Dec  3 09:46:28 np0005544118 nova_compute[187283]: 2025-12-03 14:46:28.346 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:46:28 np0005544118 nova_compute[187283]: 2025-12-03 14:46:28.346 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:46:28 np0005544118 nova_compute[187283]: 2025-12-03 14:46:28.346 187287 DEBUG nova.network.neutron [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.181 187287 DEBUG nova.network.neutron [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Updating instance_info_cache with network_info: [{"id": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "address": "fa:16:3e:82:5a:70", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc6959a-c1", "ovs_interfaceid": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.209 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.227 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.228 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.228 187287 DEBUG oslo_concurrency.lockutils [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.233 187287 INFO nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:46:30 np0005544118 virtqemud[186958]: Domain id=22 name='instance-00000019' uuid=66c14c84-6e65-420e-8460-c57e94c10ad1 is tainted: custom-monitor
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:30 np0005544118 nova_compute[187283]: 2025-12-03 14:46:30.736 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:30 np0005544118 podman[219034]: 2025-12-03 14:46:30.830352839 +0000 UTC m=+0.058208262 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible)
Dec  3 09:46:31 np0005544118 nova_compute[187283]: 2025-12-03 14:46:31.241 187287 INFO nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:46:31 np0005544118 nova_compute[187283]: 2025-12-03 14:46:31.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:32 np0005544118 nova_compute[187283]: 2025-12-03 14:46:32.247 187287 INFO nova.virt.libvirt.driver [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:46:32 np0005544118 nova_compute[187283]: 2025-12-03 14:46:32.252 187287 DEBUG nova.compute.manager [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:46:32 np0005544118 nova_compute[187283]: 2025-12-03 14:46:32.275 187287 DEBUG nova.objects.instance [None req-ad2bafd8-e148-49d6-919b-a66f4602f08d b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:46:32 np0005544118 nova_compute[187283]: 2025-12-03 14:46:32.571 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:46:35 np0005544118 podman[197639]: time="2025-12-03T14:46:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:46:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:46:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18323 "" "Go-http-client/1.1"
Dec  3 09:46:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:46:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3068 "" "Go-http-client/1.1"
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.716 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.717 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.717 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.718 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.718 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.719 187287 INFO nova.compute.manager [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Terminating instance#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.720 187287 DEBUG nova.compute.manager [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.739 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:35 np0005544118 kernel: tap86428e20-27 (unregistering): left promiscuous mode
Dec  3 09:46:35 np0005544118 NetworkManager[55710]: <info>  [1764773195.7946] device (tap86428e20-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:46:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:35Z|00236|binding|INFO|Releasing lport 86428e20-27e4-4c02-9dfc-e15d45e5c8cf from this chassis (sb_readonly=0)
Dec  3 09:46:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:35Z|00237|binding|INFO|Setting lport 86428e20-27e4-4c02-9dfc-e15d45e5c8cf down in Southbound
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.803 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:35 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:35Z|00238|binding|INFO|Removing iface tap86428e20-27 ovn-installed in OVS
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.806 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.806 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquired lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.807 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.807 187287 DEBUG nova.objects.instance [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lazy-loading 'info_cache' on Instance uuid 66c14c84-6e65-420e-8460-c57e94c10ad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.808 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.813 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:86:97 10.100.0.4'], port_security=['fa:16:3e:84:86:97 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '1aa3b883-c6d3-427a-981b-001724e618c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24de5d6e-2ac8-426f-9829-e0345484f333', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31c06166ed7946108a60a70c4f424899', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a86cd676-17cb-4aeb-a389-e4ff346ec635', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a997ea5-9f15-4402-8360-f2198fef78d4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=86428e20-27e4-4c02-9dfc-e15d45e5c8cf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.815 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 86428e20-27e4-4c02-9dfc-e15d45e5c8cf in datapath 24de5d6e-2ac8-426f-9829-e0345484f333 unbound from our chassis#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.816 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 24de5d6e-2ac8-426f-9829-e0345484f333#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.817 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.833 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[962fa12f-d817-4cc9-9bb8-a182c0208b00]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:35 np0005544118 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec  3 09:46:35 np0005544118 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001a.scope: Consumed 15.554s CPU time.
Dec  3 09:46:35 np0005544118 systemd-machined[153602]: Machine qemu-21-instance-0000001a terminated.
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.862 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2dae5c-f830-4b3a-bc6f-bdb00d52333c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.865 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[04c744ac-002d-4c3d-81d1-c5fc245e24a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.892 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[1db0e308-577e-4ee8-b3e5-5fa49227804f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.908 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c44d2dd0-79a9-417f-9a9d-c5581810a524]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap24de5d6e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f5:56:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 7, 'rx_bytes': 1456, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526935, 'reachable_time': 19375, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219067, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.925 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[29e96eb5-31f6-435f-96eb-ff794febeff9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap24de5d6e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526946, 'tstamp': 526946}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219068, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap24de5d6e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 526948, 'tstamp': 526948}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219068, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.927 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24de5d6e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.928 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:35 np0005544118 nova_compute[187283]: 2025-12-03 14:46:35.933 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.933 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap24de5d6e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.934 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.934 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap24de5d6e-20, col_values=(('external_ids', {'iface-id': '46a398a0-ef17-4439-b26e-87c3dd2c7c58'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:35 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:35.935 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.028 187287 INFO nova.virt.libvirt.driver [-] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Instance destroyed successfully.#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.029 187287 DEBUG nova.objects.instance [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lazy-loading 'resources' on Instance uuid 1aa3b883-c6d3-427a-981b-001724e618c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.050 187287 DEBUG nova.virt.libvirt.vif [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:45:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-787643382',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-787643382',id=26,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:45:14Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='31c06166ed7946108a60a70c4f424899',ramdisk_id='',reservation_id='r-2hov40xh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',imag
e_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:45:14Z,user_data=None,user_id='ef71ac78a3c14698845fdb4e5991acf4',uuid=1aa3b883-c6d3-427a-981b-001724e618c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.050 187287 DEBUG nova.network.os_vif_util [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converting VIF {"id": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "address": "fa:16:3e:84:86:97", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap86428e20-27", "ovs_interfaceid": "86428e20-27e4-4c02-9dfc-e15d45e5c8cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.051 187287 DEBUG nova.network.os_vif_util [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:84:86:97,bridge_name='br-int',has_traffic_filtering=True,id=86428e20-27e4-4c02-9dfc-e15d45e5c8cf,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86428e20-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.051 187287 DEBUG os_vif [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:86:97,bridge_name='br-int',has_traffic_filtering=True,id=86428e20-27e4-4c02-9dfc-e15d45e5c8cf,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86428e20-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.053 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.053 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap86428e20-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.054 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.057 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.059 187287 INFO os_vif [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:84:86:97,bridge_name='br-int',has_traffic_filtering=True,id=86428e20-27e4-4c02-9dfc-e15d45e5c8cf,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap86428e20-27')#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.060 187287 INFO nova.virt.libvirt.driver [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Deleting instance files /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9_del#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.061 187287 INFO nova.virt.libvirt.driver [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Deletion of /var/lib/nova/instances/1aa3b883-c6d3-427a-981b-001724e618c9_del complete#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.118 187287 DEBUG nova.compute.manager [req-c42510d3-45fd-4582-8999-92d9c93c651e req-553d2a8f-1280-4bab-8a1d-6e7310149c95 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received event network-vif-unplugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.118 187287 DEBUG oslo_concurrency.lockutils [req-c42510d3-45fd-4582-8999-92d9c93c651e req-553d2a8f-1280-4bab-8a1d-6e7310149c95 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.119 187287 DEBUG oslo_concurrency.lockutils [req-c42510d3-45fd-4582-8999-92d9c93c651e req-553d2a8f-1280-4bab-8a1d-6e7310149c95 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.119 187287 DEBUG oslo_concurrency.lockutils [req-c42510d3-45fd-4582-8999-92d9c93c651e req-553d2a8f-1280-4bab-8a1d-6e7310149c95 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.119 187287 DEBUG nova.compute.manager [req-c42510d3-45fd-4582-8999-92d9c93c651e req-553d2a8f-1280-4bab-8a1d-6e7310149c95 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] No waiting events found dispatching network-vif-unplugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.119 187287 DEBUG nova.compute.manager [req-c42510d3-45fd-4582-8999-92d9c93c651e req-553d2a8f-1280-4bab-8a1d-6e7310149c95 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received event network-vif-unplugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.127 187287 INFO nova.compute.manager [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.128 187287 DEBUG oslo.service.loopingcall [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.128 187287 DEBUG nova.compute.manager [-] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.128 187287 DEBUG nova.network.neutron [-] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.828 187287 DEBUG nova.network.neutron [-] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.845 187287 INFO nova.compute.manager [-] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Took 0.72 seconds to deallocate network for instance.#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.914 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.914 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:36 np0005544118 nova_compute[187283]: 2025-12-03 14:46:36.996 187287 DEBUG nova.compute.provider_tree [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.016 187287 DEBUG nova.scheduler.client.report [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.037 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.063 187287 INFO nova.scheduler.client.report [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Deleted allocations for instance 1aa3b883-c6d3-427a-981b-001724e618c9#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.128 187287 DEBUG oslo_concurrency.lockutils [None req-e99d9b7e-40dd-4f32-a977-74209cf2fcca ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.304 187287 DEBUG nova.network.neutron [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Updating instance_info_cache with network_info: [{"id": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "address": "fa:16:3e:82:5a:70", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc6959a-c1", "ovs_interfaceid": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.330 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Releasing lock "refresh_cache-66c14c84-6e65-420e-8460-c57e94c10ad1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.330 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.331 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.354 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.354 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.354 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.354 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.454 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:37 np0005544118 podman[219088]: 2025-12-03 14:46:37.46421463 +0000 UTC m=+0.056224548 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.520 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.521 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.583 187287 DEBUG oslo_concurrency.processutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.669 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "66c14c84-6e65-420e-8460-c57e94c10ad1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.669 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.670 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.670 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.670 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.671 187287 INFO nova.compute.manager [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Terminating instance#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.672 187287 DEBUG nova.compute.manager [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:46:37 np0005544118 kernel: tap3dc6959a-c1 (unregistering): left promiscuous mode
Dec  3 09:46:37 np0005544118 NetworkManager[55710]: <info>  [1764773197.6934] device (tap3dc6959a-c1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:46:37 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:37Z|00239|binding|INFO|Releasing lport 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a from this chassis (sb_readonly=0)
Dec  3 09:46:37 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:37Z|00240|binding|INFO|Setting lport 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a down in Southbound
Dec  3 09:46:37 np0005544118 ovn_controller[95637]: 2025-12-03T14:46:37Z|00241|binding|INFO|Removing iface tap3dc6959a-c1 ovn-installed in OVS
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.697 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.702 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:82:5a:70 10.100.0.10'], port_security=['fa:16:3e:82:5a:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '66c14c84-6e65-420e-8460-c57e94c10ad1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-24de5d6e-2ac8-426f-9829-e0345484f333', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '31c06166ed7946108a60a70c4f424899', 'neutron:revision_number': '13', 'neutron:security_group_ids': 'a86cd676-17cb-4aeb-a389-e4ff346ec635', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8a997ea5-9f15-4402-8360-f2198fef78d4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.703 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a in datapath 24de5d6e-2ac8-426f-9829-e0345484f333 unbound from our chassis#033[00m
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.705 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 24de5d6e-2ac8-426f-9829-e0345484f333, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.705 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[269c7f14-9d95-490c-847b-9cbddd147eb9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.706 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333 namespace which is not needed anymore#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.713 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:37 np0005544118 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000019.scope: Deactivated successfully.
Dec  3 09:46:37 np0005544118 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000019.scope: Consumed 2.293s CPU time.
Dec  3 09:46:37 np0005544118 systemd-machined[153602]: Machine qemu-22-instance-00000019 terminated.
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.773 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.775 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5704MB free_disk=73.30493927001953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.775 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.775 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:37 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [NOTICE]   (218624) : haproxy version is 2.8.14-c23fe91
Dec  3 09:46:37 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [NOTICE]   (218624) : path to executable is /usr/sbin/haproxy
Dec  3 09:46:37 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [WARNING]  (218624) : Exiting Master process...
Dec  3 09:46:37 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [WARNING]  (218624) : Exiting Master process...
Dec  3 09:46:37 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [ALERT]    (218624) : Current worker (218626) exited with code 143 (Terminated)
Dec  3 09:46:37 np0005544118 neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333[218620]: [WARNING]  (218624) : All workers exited. Exiting... (0)
Dec  3 09:46:37 np0005544118 systemd[1]: libpod-d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda.scope: Deactivated successfully.
Dec  3 09:46:37 np0005544118 podman[219138]: 2025-12-03 14:46:37.839619128 +0000 UTC m=+0.047375016 container died d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:46:37 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda-userdata-shm.mount: Deactivated successfully.
Dec  3 09:46:37 np0005544118 systemd[1]: var-lib-containers-storage-overlay-53c0fcb0c9e88c4c67f765881d9878fcd292a2beb0d657ec0bc31a559feea667-merged.mount: Deactivated successfully.
Dec  3 09:46:37 np0005544118 podman[219138]: 2025-12-03 14:46:37.876183477 +0000 UTC m=+0.083939355 container cleanup d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.878 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Instance 66c14c84-6e65-420e-8460-c57e94c10ad1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.879 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.879 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:46:37 np0005544118 NetworkManager[55710]: <info>  [1764773197.8885] manager: (tap3dc6959a-c1): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Dec  3 09:46:37 np0005544118 systemd[1]: libpod-conmon-d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda.scope: Deactivated successfully.
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.930 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.934 187287 INFO nova.virt.libvirt.driver [-] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Instance destroyed successfully.#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.935 187287 DEBUG nova.objects.instance [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lazy-loading 'resources' on Instance uuid 66c14c84-6e65-420e-8460-c57e94c10ad1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:46:37 np0005544118 podman[219167]: 2025-12-03 14:46:37.944179245 +0000 UTC m=+0.044174898 container remove d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.948 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[259e613f-c4d3-4906-ada7-070880579744]: (4, ('Wed Dec  3 02:46:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333 (d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda)\nd0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda\nWed Dec  3 02:46:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333 (d0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda)\nd0d0c296b8a370e683197271ff719a0ef7a596fed2eb5c26a2dd677462e93cda\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.950 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[73d24f7e-3ce5-493b-ae4c-c38191bb6efb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:37 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:37.953 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap24de5d6e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.959 187287 DEBUG nova.virt.libvirt.vif [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:44:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-594785372',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-594785372',id=25,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:44:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='31c06166ed7946108a60a70c4f424899',ramdisk_id='',reservation_id='r-9l5i3bkm',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-1741413755-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:46:32Z,user_data=None,user_id='ef71ac78a3c14698845fdb4e5991acf4',uuid=66c14c84-6e65-420e-8460-c57e94c10ad1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "address": "fa:16:3e:82:5a:70", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc6959a-c1", "ovs_interfaceid": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.960 187287 DEBUG nova.network.os_vif_util [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converting VIF {"id": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "address": "fa:16:3e:82:5a:70", "network": {"id": "24de5d6e-2ac8-426f-9829-e0345484f333", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-709628661-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "31c06166ed7946108a60a70c4f424899", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3dc6959a-c1", "ovs_interfaceid": "3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.960 187287 DEBUG nova.network.os_vif_util [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:82:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc6959a-c1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.965 187287 DEBUG os_vif [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc6959a-c1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.968 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.971 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:37 np0005544118 nova_compute[187283]: 2025-12-03 14:46:37.972 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3dc6959a-c1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.010 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.010 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.014 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:38 np0005544118 kernel: tap24de5d6e-20: left promiscuous mode
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.016 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.017 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.019 187287 INFO os_vif [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:82:5a:70,bridge_name='br-int',has_traffic_filtering=True,id=3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a,network=Network(24de5d6e-2ac8-426f-9829-e0345484f333),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3dc6959a-c1')#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.020 187287 INFO nova.virt.libvirt.driver [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Deleting instance files /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1_del#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.020 187287 INFO nova.virt.libvirt.driver [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Deletion of /var/lib/nova/instances/66c14c84-6e65-420e-8460-c57e94c10ad1_del complete#033[00m
Dec  3 09:46:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:38.020 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a2433f-b038-4394-886a-54eb6673dba9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.028 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:38.038 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a2c80386-688c-4380-9b6c-994bb09c0d49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:38.039 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[0f958342-14f5-42d9-b564-865f588a7beb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:38.056 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fc56ef2f-912a-4fbf-867c-bf4fa5888f79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 526928, 'reachable_time': 33541, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219198, 'error': None, 'target': 'ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:38.059 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-24de5d6e-2ac8-426f-9829-e0345484f333 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:46:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:46:38.060 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[01480a55-1001-4b2d-b14c-214d9a07cf9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:46:38 np0005544118 systemd[1]: run-netns-ovnmeta\x2d24de5d6e\x2d2ac8\x2d426f\x2d9829\x2de0345484f333.mount: Deactivated successfully.
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.080 187287 INFO nova.compute.manager [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.080 187287 DEBUG oslo.service.loopingcall [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.081 187287 DEBUG nova.compute.manager [-] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.081 187287 DEBUG nova.network.neutron [-] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.227 187287 DEBUG nova.compute.manager [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received event network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.228 187287 DEBUG oslo_concurrency.lockutils [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.228 187287 DEBUG oslo_concurrency.lockutils [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.228 187287 DEBUG oslo_concurrency.lockutils [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "1aa3b883-c6d3-427a-981b-001724e618c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.228 187287 DEBUG nova.compute.manager [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] No waiting events found dispatching network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.228 187287 WARNING nova.compute.manager [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received unexpected event network-vif-plugged-86428e20-27e4-4c02-9dfc-e15d45e5c8cf for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.229 187287 DEBUG nova.compute.manager [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Received event network-vif-deleted-86428e20-27e4-4c02-9dfc-e15d45e5c8cf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.229 187287 DEBUG nova.compute.manager [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Received event network-vif-unplugged-3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.229 187287 DEBUG oslo_concurrency.lockutils [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.229 187287 DEBUG oslo_concurrency.lockutils [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.230 187287 DEBUG oslo_concurrency.lockutils [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.230 187287 DEBUG nova.compute.manager [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] No waiting events found dispatching network-vif-unplugged-3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.230 187287 DEBUG nova.compute.manager [req-c1527b96-e28b-403e-96ba-d9fdc3d28ea1 req-02bfbaf1-c338-442f-83e7-3ccbe56dbe22 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Received event network-vif-unplugged-3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:46:38 np0005544118 nova_compute[187283]: 2025-12-03 14:46:38.756 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.300 187287 DEBUG nova.compute.manager [req-1a1122b8-3c0e-48a9-af2c-a8e749840813 req-cb4b0c81-4e5b-42c6-88a5-cd9bcd31b155 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Received event network-vif-plugged-3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.300 187287 DEBUG oslo_concurrency.lockutils [req-1a1122b8-3c0e-48a9-af2c-a8e749840813 req-cb4b0c81-4e5b-42c6-88a5-cd9bcd31b155 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.300 187287 DEBUG oslo_concurrency.lockutils [req-1a1122b8-3c0e-48a9-af2c-a8e749840813 req-cb4b0c81-4e5b-42c6-88a5-cd9bcd31b155 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.300 187287 DEBUG oslo_concurrency.lockutils [req-1a1122b8-3c0e-48a9-af2c-a8e749840813 req-cb4b0c81-4e5b-42c6-88a5-cd9bcd31b155 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.301 187287 DEBUG nova.compute.manager [req-1a1122b8-3c0e-48a9-af2c-a8e749840813 req-cb4b0c81-4e5b-42c6-88a5-cd9bcd31b155 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] No waiting events found dispatching network-vif-plugged-3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.301 187287 WARNING nova.compute.manager [req-1a1122b8-3c0e-48a9-af2c-a8e749840813 req-cb4b0c81-4e5b-42c6-88a5-cd9bcd31b155 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Received unexpected event network-vif-plugged-3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a for instance with vm_state active and task_state deleting.#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.742 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.751 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.937 187287 DEBUG nova.network.neutron [-] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:46:40 np0005544118 nova_compute[187283]: 2025-12-03 14:46:40.952 187287 INFO nova.compute.manager [-] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Took 2.87 seconds to deallocate network for instance.#033[00m
Dec  3 09:46:41 np0005544118 nova_compute[187283]: 2025-12-03 14:46:41.010 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:46:41 np0005544118 nova_compute[187283]: 2025-12-03 14:46:41.010 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:46:41 np0005544118 nova_compute[187283]: 2025-12-03 14:46:41.069 187287 DEBUG nova.compute.provider_tree [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:46:41 np0005544118 nova_compute[187283]: 2025-12-03 14:46:41.085 187287 DEBUG nova.scheduler.client.report [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:46:41 np0005544118 nova_compute[187283]: 2025-12-03 14:46:41.124 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:41 np0005544118 nova_compute[187283]: 2025-12-03 14:46:41.150 187287 INFO nova.scheduler.client.report [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Deleted allocations for instance 66c14c84-6e65-420e-8460-c57e94c10ad1#033[00m
Dec  3 09:46:41 np0005544118 nova_compute[187283]: 2025-12-03 14:46:41.208 187287 DEBUG oslo_concurrency.lockutils [None req-c0a022ad-e0bf-4aaa-a9d0-292d3cd926c7 ef71ac78a3c14698845fdb4e5991acf4 31c06166ed7946108a60a70c4f424899 - - default default] Lock "66c14c84-6e65-420e-8460-c57e94c10ad1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:46:41 np0005544118 podman[219199]: 2025-12-03 14:46:41.823359629 +0000 UTC m=+0.054174972 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:46:42 np0005544118 nova_compute[187283]: 2025-12-03 14:46:42.388 187287 DEBUG nova.compute.manager [req-cc04b4e3-c472-4b34-ba27-e1ab2d785dcb req-5502989d-2d5c-455a-bc7c-d5547def27cd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Received event network-vif-deleted-3dc6959a-c1b0-4705-9b1f-a73e4e8dfb5a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:46:42 np0005544118 nova_compute[187283]: 2025-12-03 14:46:42.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:43 np0005544118 nova_compute[187283]: 2025-12-03 14:46:43.015 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:43 np0005544118 nova_compute[187283]: 2025-12-03 14:46:43.618 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:46:43 np0005544118 nova_compute[187283]: 2025-12-03 14:46:43.618 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:46:45 np0005544118 nova_compute[187283]: 2025-12-03 14:46:45.780 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:48 np0005544118 nova_compute[187283]: 2025-12-03 14:46:48.017 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:48 np0005544118 podman[219223]: 2025-12-03 14:46:48.848338155 +0000 UTC m=+0.082637419 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible)
Dec  3 09:46:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:46:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:46:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:46:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:46:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:46:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:46:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:46:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:46:50 np0005544118 nova_compute[187283]: 2025-12-03 14:46:50.828 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:51 np0005544118 nova_compute[187283]: 2025-12-03 14:46:51.027 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764773196.0261016, 1aa3b883-c6d3-427a-981b-001724e618c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:46:51 np0005544118 nova_compute[187283]: 2025-12-03 14:46:51.028 187287 INFO nova.compute.manager [-] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:46:51 np0005544118 nova_compute[187283]: 2025-12-03 14:46:51.052 187287 DEBUG nova.compute.manager [None req-5063b262-403c-4574-a985-a8653b81b41a - - - - - -] [instance: 1aa3b883-c6d3-427a-981b-001724e618c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:46:52 np0005544118 nova_compute[187283]: 2025-12-03 14:46:52.929 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764773197.928527, 66c14c84-6e65-420e-8460-c57e94c10ad1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:46:52 np0005544118 nova_compute[187283]: 2025-12-03 14:46:52.931 187287 INFO nova.compute.manager [-] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:46:52 np0005544118 nova_compute[187283]: 2025-12-03 14:46:52.959 187287 DEBUG nova.compute.manager [None req-65f6adb1-df9d-4493-845a-bdedf574f135 - - - - - -] [instance: 66c14c84-6e65-420e-8460-c57e94c10ad1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:46:53 np0005544118 nova_compute[187283]: 2025-12-03 14:46:53.019 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:55 np0005544118 nova_compute[187283]: 2025-12-03 14:46:55.833 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:46:57 np0005544118 podman[219250]: 2025-12-03 14:46:57.818496458 +0000 UTC m=+0.050667016 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350)
Dec  3 09:46:58 np0005544118 nova_compute[187283]: 2025-12-03 14:46:58.021 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:00 np0005544118 nova_compute[187283]: 2025-12-03 14:47:00.836 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:00.986 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:00.987 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:00.987 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:01 np0005544118 podman[219272]: 2025-12-03 14:47:01.824663532 +0000 UTC m=+0.056360601 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, managed_by=edpm_ansible)
Dec  3 09:47:03 np0005544118 nova_compute[187283]: 2025-12-03 14:47:03.023 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:05 np0005544118 podman[197639]: time="2025-12-03T14:47:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:47:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:47:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:47:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:47:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  3 09:47:05 np0005544118 nova_compute[187283]: 2025-12-03 14:47:05.868 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:07 np0005544118 podman[219293]: 2025-12-03 14:47:07.840107212 +0000 UTC m=+0.058524570 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:47:08 np0005544118 nova_compute[187283]: 2025-12-03 14:47:08.025 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:10 np0005544118 nova_compute[187283]: 2025-12-03 14:47:10.869 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:10 np0005544118 ovn_controller[95637]: 2025-12-03T14:47:10Z|00242|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  3 09:47:12 np0005544118 podman[219313]: 2025-12-03 14:47:12.830995085 +0000 UTC m=+0.058042437 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:47:13 np0005544118 nova_compute[187283]: 2025-12-03 14:47:13.029 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:15 np0005544118 nova_compute[187283]: 2025-12-03 14:47:15.225 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:15 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:15.226 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:47:15 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:15.226 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:47:15 np0005544118 nova_compute[187283]: 2025-12-03 14:47:15.871 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:17 np0005544118 nova_compute[187283]: 2025-12-03 14:47:17.191 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:18 np0005544118 nova_compute[187283]: 2025-12-03 14:47:18.031 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:47:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:47:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:47:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:47:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:47:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:47:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:47:19 np0005544118 podman[219337]: 2025-12-03 14:47:19.844676624 +0000 UTC m=+0.080428449 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Dec  3 09:47:20 np0005544118 nova_compute[187283]: 2025-12-03 14:47:20.873 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:21 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:21.230 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:47:23 np0005544118 nova_compute[187283]: 2025-12-03 14:47:23.033 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:25 np0005544118 nova_compute[187283]: 2025-12-03 14:47:25.889 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:27 np0005544118 nova_compute[187283]: 2025-12-03 14:47:27.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:28 np0005544118 nova_compute[187283]: 2025-12-03 14:47:28.035 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:28 np0005544118 nova_compute[187283]: 2025-12-03 14:47:28.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:28 np0005544118 podman[219363]: 2025-12-03 14:47:28.827689877 +0000 UTC m=+0.058006035 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  3 09:47:30 np0005544118 nova_compute[187283]: 2025-12-03 14:47:30.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:30 np0005544118 nova_compute[187283]: 2025-12-03 14:47:30.890 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:32 np0005544118 nova_compute[187283]: 2025-12-03 14:47:32.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:32 np0005544118 nova_compute[187283]: 2025-12-03 14:47:32.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:32 np0005544118 podman[219384]: 2025-12-03 14:47:32.824412893 +0000 UTC m=+0.050993524 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec  3 09:47:33 np0005544118 nova_compute[187283]: 2025-12-03 14:47:33.037 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:35 np0005544118 podman[197639]: time="2025-12-03T14:47:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:47:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:47:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:47:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:47:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2596 "" "Go-http-client/1.1"
Dec  3 09:47:35 np0005544118 nova_compute[187283]: 2025-12-03 14:47:35.894 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.628 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.628 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.628 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.629 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.856 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.857 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5870MB free_disk=73.33380126953125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.857 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.857 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.920 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:47:36 np0005544118 nova_compute[187283]: 2025-12-03 14:47:36.920 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:47:37 np0005544118 nova_compute[187283]: 2025-12-03 14:47:37.139 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:47:37 np0005544118 nova_compute[187283]: 2025-12-03 14:47:37.153 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:47:37 np0005544118 nova_compute[187283]: 2025-12-03 14:47:37.192 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:47:37 np0005544118 nova_compute[187283]: 2025-12-03 14:47:37.193 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:38 np0005544118 nova_compute[187283]: 2025-12-03 14:47:38.039 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:38 np0005544118 nova_compute[187283]: 2025-12-03 14:47:38.194 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:38 np0005544118 nova_compute[187283]: 2025-12-03 14:47:38.195 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:47:38 np0005544118 nova_compute[187283]: 2025-12-03 14:47:38.217 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:47:38 np0005544118 podman[219406]: 2025-12-03 14:47:38.810369348 +0000 UTC m=+0.047391767 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  3 09:47:40 np0005544118 nova_compute[187283]: 2025-12-03 14:47:40.624 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:40 np0005544118 nova_compute[187283]: 2025-12-03 14:47:40.894 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:42 np0005544118 nova_compute[187283]: 2025-12-03 14:47:42.601 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:43 np0005544118 nova_compute[187283]: 2025-12-03 14:47:43.041 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:43 np0005544118 nova_compute[187283]: 2025-12-03 14:47:43.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:47:43 np0005544118 nova_compute[187283]: 2025-12-03 14:47:43.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:47:43 np0005544118 podman[219426]: 2025-12-03 14:47:43.820796863 +0000 UTC m=+0.051357463 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:47:45 np0005544118 nova_compute[187283]: 2025-12-03 14:47:45.896 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:48 np0005544118 nova_compute[187283]: 2025-12-03 14:47:48.043 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:47:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:47:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:47:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:47:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:47:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:47:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:47:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:47:50 np0005544118 ovn_controller[95637]: 2025-12-03T14:47:50Z|00243|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec  3 09:47:50 np0005544118 podman[219450]: 2025-12-03 14:47:50.845473312 +0000 UTC m=+0.083319117 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  3 09:47:50 np0005544118 nova_compute[187283]: 2025-12-03 14:47:50.946 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:53 np0005544118 nova_compute[187283]: 2025-12-03 14:47:53.045 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.173 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.174 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.198 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.262 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.262 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.269 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.270 187287 INFO nova.compute.claims [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Claim successful on node compute-1.ctlplane.example.com#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.400 187287 DEBUG nova.compute.provider_tree [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.415 187287 DEBUG nova.scheduler.client.report [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.432 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.433 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.470 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.472 187287 DEBUG nova.network.neutron [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.488 187287 INFO nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.509 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.590 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.591 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.591 187287 INFO nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Creating image(s)#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.592 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquiring lock "/var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.593 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "/var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.593 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "/var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.608 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.676 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.678 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.678 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.692 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.745 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.746 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.790 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.791 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.792 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.848 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.849 187287 DEBUG nova.virt.disk.api [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Checking if we can resize image /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.850 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.916 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.917 187287 DEBUG nova.virt.disk.api [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Cannot resize image /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.918 187287 DEBUG nova.objects.instance [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lazy-loading 'migration_context' on Instance uuid 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.933 187287 DEBUG nova.policy [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ce698eb5a4b84e229f3adc6e31ba044b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6ed557ae995c4dc49409420624bc5389', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.939 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.939 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Ensure instance console log exists: /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.939 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.940 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.940 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:55 np0005544118 nova_compute[187283]: 2025-12-03 14:47:55.997 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:56 np0005544118 nova_compute[187283]: 2025-12-03 14:47:56.482 187287 DEBUG nova.network.neutron [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Successfully created port: 5ce00fa6-307c-470b-af47-ed25b5e0b312 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.129 187287 DEBUG nova.network.neutron [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Successfully updated port: 5ce00fa6-307c-470b-af47-ed25b5e0b312 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.148 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquiring lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.149 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquired lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.149 187287 DEBUG nova.network.neutron [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.226 187287 DEBUG nova.compute.manager [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-changed-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.226 187287 DEBUG nova.compute.manager [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Refreshing instance network info cache due to event network-changed-5ce00fa6-307c-470b-af47-ed25b5e0b312. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.227 187287 DEBUG oslo_concurrency.lockutils [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:47:57 np0005544118 nova_compute[187283]: 2025-12-03 14:47:57.310 187287 DEBUG nova.network.neutron [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.047 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.129 187287 DEBUG nova.network.neutron [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Updating instance_info_cache with network_info: [{"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.153 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Releasing lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.153 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Instance network_info: |[{"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.153 187287 DEBUG oslo_concurrency.lockutils [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.154 187287 DEBUG nova.network.neutron [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Refreshing network info cache for port 5ce00fa6-307c-470b-af47-ed25b5e0b312 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.156 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Start _get_guest_xml network_info=[{"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encrypted': False, 'boot_index': 0, 'encryption_options': None, 'device_name': '/dev/vda', 'image_id': 'c4df1e47-ea6c-486a-a6b4-60f325b44502'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.160 187287 WARNING nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.166 187287 DEBUG nova.virt.libvirt.host [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.166 187287 DEBUG nova.virt.libvirt.host [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.171 187287 DEBUG nova.virt.libvirt.host [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.172 187287 DEBUG nova.virt.libvirt.host [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.173 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.173 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-03T14:15:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ec610f84-c649-49d7-9c7a-a22befc31fb8',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-03T14:15:41Z,direct_url=<?>,disk_format='qcow2',id=c4df1e47-ea6c-486a-a6b4-60f325b44502,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='ecdb6bc2f490401f83229422485f1b7a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-03T14:15:43Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.174 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.174 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.174 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.175 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.175 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.175 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.175 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.176 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.176 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.176 187287 DEBUG nova.virt.hardware [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.180 187287 DEBUG nova.virt.libvirt.vif [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-762989702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-762989702',id=27,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed557ae995c4dc49409420624bc5389',ramdisk_id='',reservation_id='r-drhhrkiv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-679609197',owner_user_name='tempest-TestExecuteWorkloadBal
ancingStrategy-679609197-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:47:55Z,user_data=None,user_id='ce698eb5a4b84e229f3adc6e31ba044b',uuid=0d7f4cb9-d8d7-4db9-8c74-8cf11e501319,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.180 187287 DEBUG nova.network.os_vif_util [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Converting VIF {"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.181 187287 DEBUG nova.network.os_vif_util [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.182 187287 DEBUG nova.objects.instance [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.195 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] End _get_guest_xml xml=<domain type="kvm">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <uuid>0d7f4cb9-d8d7-4db9-8c74-8cf11e501319</uuid>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <name>instance-0000001b</name>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <memory>131072</memory>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <vcpu>1</vcpu>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <metadata>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <nova:name>tempest-TestExecuteWorkloadBalancingStrategy-server-762989702</nova:name>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <nova:creationTime>2025-12-03 14:47:58</nova:creationTime>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <nova:flavor name="m1.nano">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:memory>128</nova:memory>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:disk>1</nova:disk>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:swap>0</nova:swap>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:ephemeral>0</nova:ephemeral>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:vcpus>1</nova:vcpus>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      </nova:flavor>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <nova:owner>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:user uuid="ce698eb5a4b84e229f3adc6e31ba044b">tempest-TestExecuteWorkloadBalancingStrategy-679609197-project-member</nova:user>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:project uuid="6ed557ae995c4dc49409420624bc5389">tempest-TestExecuteWorkloadBalancingStrategy-679609197</nova:project>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      </nova:owner>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <nova:root type="image" uuid="c4df1e47-ea6c-486a-a6b4-60f325b44502"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <nova:ports>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        <nova:port uuid="5ce00fa6-307c-470b-af47-ed25b5e0b312">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:        </nova:port>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      </nova:ports>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </nova:instance>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  </metadata>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <sysinfo type="smbios">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <system>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <entry name="manufacturer">RDO</entry>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <entry name="product">OpenStack Compute</entry>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <entry name="serial">0d7f4cb9-d8d7-4db9-8c74-8cf11e501319</entry>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <entry name="uuid">0d7f4cb9-d8d7-4db9-8c74-8cf11e501319</entry>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <entry name="family">Virtual Machine</entry>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </system>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  </sysinfo>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <os>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <type arch="x86_64" machine="q35">hvm</type>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <boot dev="hd"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <smbios mode="sysinfo"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  </os>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <features>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <acpi/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <apic/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <vmcoreinfo/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  </features>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <clock offset="utc">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <timer name="pit" tickpolicy="delay"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <timer name="rtc" tickpolicy="catchup"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <timer name="hpet" present="no"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  </clock>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <cpu mode="custom" match="exact">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <model>Nehalem</model>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <topology sockets="1" cores="1" threads="1"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  </cpu>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  <devices>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <disk type="file" device="disk">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <target dev="vda" bus="virtio"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <disk type="file" device="cdrom">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <driver name="qemu" type="raw" cache="none"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <source file="/var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk.config"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <target dev="sda" bus="sata"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </disk>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <interface type="ethernet">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <mac address="fa:16:3e:6b:64:87"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <mtu size="1442"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <target dev="tap5ce00fa6-30"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </interface>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <serial type="pty">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <log file="/var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/console.log" append="off"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </serial>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <video>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <model type="virtio"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </video>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <input type="tablet" bus="usb"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <rng model="virtio">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <backend model="random">/dev/urandom</backend>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </rng>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="pci" model="pcie-root-port"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <controller type="usb" index="0"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    <memballoon model="virtio">
Dec  3 09:47:58 np0005544118 nova_compute[187283]:      <stats period="10"/>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:    </memballoon>
Dec  3 09:47:58 np0005544118 nova_compute[187283]:  </devices>
Dec  3 09:47:58 np0005544118 nova_compute[187283]: </domain>
Dec  3 09:47:58 np0005544118 nova_compute[187283]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.196 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Preparing to wait for external event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.196 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.196 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.196 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.197 187287 DEBUG nova.virt.libvirt.vif [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-03T14:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-762989702',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-762989702',id=27,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6ed557ae995c4dc49409420624bc5389',ramdisk_id='',reservation_id='r-drhhrkiv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-679609197',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-679609197-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:47:55Z,user_data=None,user_id='ce698eb5a4b84e229f3adc6e31ba044b',uuid=0d7f4cb9-d8d7-4db9-8c74-8cf11e501319,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.197 187287 DEBUG nova.network.os_vif_util [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Converting VIF {"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.198 187287 DEBUG nova.network.os_vif_util [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.199 187287 DEBUG os_vif [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.199 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.200 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.200 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.203 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.203 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5ce00fa6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.203 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5ce00fa6-30, col_values=(('external_ids', {'iface-id': '5ce00fa6-307c-470b-af47-ed25b5e0b312', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:64:87', 'vm-uuid': '0d7f4cb9-d8d7-4db9-8c74-8cf11e501319'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.204 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 NetworkManager[55710]: <info>  [1764773278.2065] manager: (tap5ce00fa6-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.207 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.209 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.210 187287 INFO os_vif [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30')#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.249 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.250 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.250 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] No VIF found with MAC fa:16:3e:6b:64:87, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.251 187287 INFO nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Using config drive#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.518 187287 INFO nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Creating config drive at /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk.config#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.522 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw8oyxe2v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.651 187287 DEBUG oslo_concurrency.processutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw8oyxe2v" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:47:58 np0005544118 kernel: tap5ce00fa6-30: entered promiscuous mode
Dec  3 09:47:58 np0005544118 ovn_controller[95637]: 2025-12-03T14:47:58Z|00244|binding|INFO|Claiming lport 5ce00fa6-307c-470b-af47-ed25b5e0b312 for this chassis.
Dec  3 09:47:58 np0005544118 ovn_controller[95637]: 2025-12-03T14:47:58Z|00245|binding|INFO|5ce00fa6-307c-470b-af47-ed25b5e0b312: Claiming fa:16:3e:6b:64:87 10.100.0.7
Dec  3 09:47:58 np0005544118 NetworkManager[55710]: <info>  [1764773278.7073] manager: (tap5ce00fa6-30): new Tun device (/org/freedesktop/NetworkManager/Devices/93)
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.707 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.709 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.723 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:64:87 10.100.0.7'], port_security=['fa:16:3e:6b:64:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0d7f4cb9-d8d7-4db9-8c74-8cf11e501319', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed557ae995c4dc49409420624bc5389', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'aa02460e-59ff-4b52-88f0-d6fc372580d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=790402b3-76a5-4b8b-b2dd-fb5e875c16a7, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=5ce00fa6-307c-470b-af47-ed25b5e0b312) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.724 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 5ce00fa6-307c-470b-af47-ed25b5e0b312 in datapath 063c09de-78bd-4f5c-9da9-f47a22f5ccd6 bound to our chassis#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.726 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 063c09de-78bd-4f5c-9da9-f47a22f5ccd6#033[00m
Dec  3 09:47:58 np0005544118 systemd-udevd[219509]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.741 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ae9a3c66-e3ca-42bd-9d41-508f1acc1492]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.743 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap063c09de-71 in ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:47:58 np0005544118 NetworkManager[55710]: <info>  [1764773278.7439] device (tap5ce00fa6-30): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:47:58 np0005544118 NetworkManager[55710]: <info>  [1764773278.7449] device (tap5ce00fa6-30): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.745 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap063c09de-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.745 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[56df0c7f-7982-4e0b-aa93-3abc8adcc92b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.747 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[582c2227-229d-4659-9413-22170f85f753]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 systemd-machined[153602]: New machine qemu-23-instance-0000001b.
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.761 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[380e8feb-1cab-4cac-abb3-37b6792f85ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.765 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 ovn_controller[95637]: 2025-12-03T14:47:58Z|00246|binding|INFO|Setting lport 5ce00fa6-307c-470b-af47-ed25b5e0b312 ovn-installed in OVS
Dec  3 09:47:58 np0005544118 ovn_controller[95637]: 2025-12-03T14:47:58Z|00247|binding|INFO|Setting lport 5ce00fa6-307c-470b-af47-ed25b5e0b312 up in Southbound
Dec  3 09:47:58 np0005544118 systemd[1]: Started Virtual Machine qemu-23-instance-0000001b.
Dec  3 09:47:58 np0005544118 nova_compute[187283]: 2025-12-03 14:47:58.772 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.777 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[af20add6-f7c5-46f6-a909-a994c0202352]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.802 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a851bf-22f0-4831-ad8d-c21835c418e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 NetworkManager[55710]: <info>  [1764773278.8083] manager: (tap063c09de-70): new Veth device (/org/freedesktop/NetworkManager/Devices/94)
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.807 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8988d7ac-42b0-4c2e-a5a5-e7f7aeb8ce8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.839 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[517d7753-c4b0-48c7-baf5-9cfb8b26d314]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.841 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[1c4f578c-6277-4c57-99e3-f862377fc0c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 NetworkManager[55710]: <info>  [1764773278.8626] device (tap063c09de-70): carrier: link connected
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.868 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9a9c58-0416-44b8-b8d2-d3a19df291fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.886 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[89620114-9b65-4088-a8a8-ddbdde066e3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap063c09de-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:63:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543447, 'reachable_time': 40805, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219545, 'error': None, 'target': 'ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.904 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fc852e4c-14a1-4135-8b0e-44bcfb084fbe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febd:6316'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 543447, 'tstamp': 543447}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219546, 'error': None, 'target': 'ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.920 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[9532219a-6ccb-4a91-b3c6-7bd21a34359c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap063c09de-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bd:63:16'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543447, 'reachable_time': 40805, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219547, 'error': None, 'target': 'ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:58 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:58.959 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8120280b-05a9-474c-914f-8af9cdb15f33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.024 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[902448d5-82e6-43e8-a8a6-0b832b7c6147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.025 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap063c09de-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.025 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.026 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap063c09de-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.027 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:59 np0005544118 NetworkManager[55710]: <info>  [1764773279.0285] manager: (tap063c09de-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Dec  3 09:47:59 np0005544118 kernel: tap063c09de-70: entered promiscuous mode
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.030 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.030 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap063c09de-70, col_values=(('external_ids', {'iface-id': '181ca954-b3a0-4794-af36-bcbc213da319'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.031 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:59 np0005544118 ovn_controller[95637]: 2025-12-03T14:47:59Z|00248|binding|INFO|Releasing lport 181ca954-b3a0-4794-af36-bcbc213da319 from this chassis (sb_readonly=0)
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.043 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.044 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/063c09de-78bd-4f5c-9da9-f47a22f5ccd6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/063c09de-78bd-4f5c-9da9-f47a22f5ccd6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.045 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[87da63f6-6123-4a9e-bf27-cbaff2c61db3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.045 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-063c09de-78bd-4f5c-9da9-f47a22f5ccd6
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/063c09de-78bd-4f5c-9da9-f47a22f5ccd6.pid.haproxy
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 063c09de-78bd-4f5c-9da9-f47a22f5ccd6
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:47:59 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:47:59.046 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'env', 'PROCESS_TAG=haproxy-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/063c09de-78bd-4f5c-9da9-f47a22f5ccd6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.143 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773279.143237, 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.144 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] VM Started (Lifecycle Event)#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.157 187287 DEBUG nova.compute.manager [req-264be357-8ac9-4f05-8b18-53b08e3f1a71 req-f9bd02ac-6c8f-4961-99c4-ce6b06e0ddf2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.158 187287 DEBUG oslo_concurrency.lockutils [req-264be357-8ac9-4f05-8b18-53b08e3f1a71 req-f9bd02ac-6c8f-4961-99c4-ce6b06e0ddf2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.158 187287 DEBUG oslo_concurrency.lockutils [req-264be357-8ac9-4f05-8b18-53b08e3f1a71 req-f9bd02ac-6c8f-4961-99c4-ce6b06e0ddf2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.158 187287 DEBUG oslo_concurrency.lockutils [req-264be357-8ac9-4f05-8b18-53b08e3f1a71 req-f9bd02ac-6c8f-4961-99c4-ce6b06e0ddf2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.158 187287 DEBUG nova.compute.manager [req-264be357-8ac9-4f05-8b18-53b08e3f1a71 req-f9bd02ac-6c8f-4961-99c4-ce6b06e0ddf2 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Processing event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.159 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.162 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.163 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.171 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.173 187287 INFO nova.virt.libvirt.driver [-] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Instance spawned successfully.#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.173 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.198 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.198 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773279.1435227, 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.198 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.203 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.204 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.204 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.205 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.205 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.205 187287 DEBUG nova.virt.libvirt.driver [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.231 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.235 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773279.1628222, 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.235 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.267 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.271 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.276 187287 INFO nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Took 3.69 seconds to spawn the instance on the hypervisor.#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.276 187287 DEBUG nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.286 187287 DEBUG nova.network.neutron [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Updated VIF entry in instance network info cache for port 5ce00fa6-307c-470b-af47-ed25b5e0b312. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.287 187287 DEBUG nova.network.neutron [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Updating instance_info_cache with network_info: [{"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.289 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.309 187287 DEBUG oslo_concurrency.lockutils [req-2df8c5a6-27d8-443c-a8bd-d964c207f8cf req-02fba877-76ea-459e-8699-47890e9cd123 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.341 187287 INFO nova.compute.manager [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Took 4.10 seconds to build instance.#033[00m
Dec  3 09:47:59 np0005544118 nova_compute[187283]: 2025-12-03 14:47:59.366 187287 DEBUG oslo_concurrency.lockutils [None req-158ab99d-fbfe-4761-a4c9-4f286b2dbd9c ce698eb5a4b84e229f3adc6e31ba044b 6ed557ae995c4dc49409420624bc5389 - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 4.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:47:59 np0005544118 podman[219583]: 2025-12-03 14:47:59.41365909 +0000 UTC m=+0.050416908 container create 5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:47:59 np0005544118 systemd[1]: Started libpod-conmon-5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272.scope.
Dec  3 09:47:59 np0005544118 podman[219583]: 2025-12-03 14:47:59.38545289 +0000 UTC m=+0.022210728 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:47:59 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:47:59 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae5f302899357e096bf9ef756ced0cc12f1cb41ef869361723e1833ae104fe58/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:47:59 np0005544118 podman[219583]: 2025-12-03 14:47:59.50438895 +0000 UTC m=+0.141146798 container init 5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec  3 09:47:59 np0005544118 podman[219596]: 2025-12-03 14:47:59.506493477 +0000 UTC m=+0.065420329 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  3 09:47:59 np0005544118 podman[219583]: 2025-12-03 14:47:59.510412054 +0000 UTC m=+0.147169872 container start 5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  3 09:47:59 np0005544118 neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6[219608]: [NOTICE]   (219624) : New worker (219626) forked
Dec  3 09:47:59 np0005544118 neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6[219608]: [NOTICE]   (219624) : Loading success.
Dec  3 09:48:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:00.987 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:00.988 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:00.988 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:00 np0005544118 nova_compute[187283]: 2025-12-03 14:48:00.999 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:01 np0005544118 nova_compute[187283]: 2025-12-03 14:48:01.242 187287 DEBUG nova.compute.manager [req-a05f203f-7e6b-4e5d-bd17-1b18c16c2efa req-c445ff88-a8de-4aec-ab12-a0fff441647a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:01 np0005544118 nova_compute[187283]: 2025-12-03 14:48:01.243 187287 DEBUG oslo_concurrency.lockutils [req-a05f203f-7e6b-4e5d-bd17-1b18c16c2efa req-c445ff88-a8de-4aec-ab12-a0fff441647a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:01 np0005544118 nova_compute[187283]: 2025-12-03 14:48:01.243 187287 DEBUG oslo_concurrency.lockutils [req-a05f203f-7e6b-4e5d-bd17-1b18c16c2efa req-c445ff88-a8de-4aec-ab12-a0fff441647a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:01 np0005544118 nova_compute[187283]: 2025-12-03 14:48:01.243 187287 DEBUG oslo_concurrency.lockutils [req-a05f203f-7e6b-4e5d-bd17-1b18c16c2efa req-c445ff88-a8de-4aec-ab12-a0fff441647a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:01 np0005544118 nova_compute[187283]: 2025-12-03 14:48:01.244 187287 DEBUG nova.compute.manager [req-a05f203f-7e6b-4e5d-bd17-1b18c16c2efa req-c445ff88-a8de-4aec-ab12-a0fff441647a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] No waiting events found dispatching network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:48:01 np0005544118 nova_compute[187283]: 2025-12-03 14:48:01.244 187287 WARNING nova.compute.manager [req-a05f203f-7e6b-4e5d-bd17-1b18c16c2efa req-c445ff88-a8de-4aec-ab12-a0fff441647a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received unexpected event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 for instance with vm_state active and task_state None.#033[00m
Dec  3 09:48:03 np0005544118 nova_compute[187283]: 2025-12-03 14:48:03.206 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:03 np0005544118 podman[219635]: 2025-12-03 14:48:03.856773926 +0000 UTC m=+0.080961543 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Dec  3 09:48:05 np0005544118 podman[197639]: time="2025-12-03T14:48:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:48:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:48:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 18324 "" "Go-http-client/1.1"
Dec  3 09:48:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:48:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3066 "" "Go-http-client/1.1"
Dec  3 09:48:06 np0005544118 nova_compute[187283]: 2025-12-03 14:48:06.020 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:08 np0005544118 nova_compute[187283]: 2025-12-03 14:48:08.261 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:09 np0005544118 podman[219655]: 2025-12-03 14:48:09.814971113 +0000 UTC m=+0.047335774 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true)
Dec  3 09:48:11 np0005544118 nova_compute[187283]: 2025-12-03 14:48:11.022 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:12 np0005544118 ovn_controller[95637]: 2025-12-03T14:48:12Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6b:64:87 10.100.0.7
Dec  3 09:48:12 np0005544118 ovn_controller[95637]: 2025-12-03T14:48:12Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6b:64:87 10.100.0.7
Dec  3 09:48:13 np0005544118 nova_compute[187283]: 2025-12-03 14:48:13.263 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:14 np0005544118 podman[219689]: 2025-12-03 14:48:14.817477234 +0000 UTC m=+0.042135263 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:48:16 np0005544118 nova_compute[187283]: 2025-12-03 14:48:16.023 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:18 np0005544118 nova_compute[187283]: 2025-12-03 14:48:18.267 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:18 np0005544118 nova_compute[187283]: 2025-12-03 14:48:18.389 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Check if temp file /var/lib/nova/instances/tmp1qpqhwj1 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Dec  3 09:48:18 np0005544118 nova_compute[187283]: 2025-12-03 14:48:18.390 187287 DEBUG nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1qpqhwj1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0d7f4cb9-d8d7-4db9-8c74-8cf11e501319',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Dec  3 09:48:19 np0005544118 nova_compute[187283]: 2025-12-03 14:48:19.006 187287 DEBUG oslo_concurrency.processutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:48:19 np0005544118 nova_compute[187283]: 2025-12-03 14:48:19.063 187287 DEBUG oslo_concurrency.processutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:48:19 np0005544118 nova_compute[187283]: 2025-12-03 14:48:19.064 187287 DEBUG oslo_concurrency.processutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:48:19 np0005544118 nova_compute[187283]: 2025-12-03 14:48:19.119 187287 DEBUG oslo_concurrency.processutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:48:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:48:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:48:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:48:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:48:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:48:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:48:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:48:21 np0005544118 systemd[1]: Created slice User Slice of UID 42436.
Dec  3 09:48:21 np0005544118 nova_compute[187283]: 2025-12-03 14:48:21.025 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:21 np0005544118 systemd[1]: Starting User Runtime Directory /run/user/42436...
Dec  3 09:48:21 np0005544118 systemd-logind[795]: New session 34 of user nova.
Dec  3 09:48:21 np0005544118 systemd[1]: Finished User Runtime Directory /run/user/42436.
Dec  3 09:48:21 np0005544118 systemd[1]: Starting User Manager for UID 42436...
Dec  3 09:48:21 np0005544118 podman[219719]: 2025-12-03 14:48:21.14156857 +0000 UTC m=+0.128619025 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:48:21 np0005544118 systemd[219734]: Queued start job for default target Main User Target.
Dec  3 09:48:21 np0005544118 systemd[219734]: Created slice User Application Slice.
Dec  3 09:48:21 np0005544118 systemd[219734]: Started Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:48:21 np0005544118 systemd[219734]: Started Daily Cleanup of User's Temporary Directories.
Dec  3 09:48:21 np0005544118 systemd[219734]: Reached target Paths.
Dec  3 09:48:21 np0005544118 systemd[219734]: Reached target Timers.
Dec  3 09:48:21 np0005544118 systemd[219734]: Starting D-Bus User Message Bus Socket...
Dec  3 09:48:21 np0005544118 systemd[219734]: Starting Create User's Volatile Files and Directories...
Dec  3 09:48:21 np0005544118 systemd[219734]: Finished Create User's Volatile Files and Directories.
Dec  3 09:48:21 np0005544118 systemd[219734]: Listening on D-Bus User Message Bus Socket.
Dec  3 09:48:21 np0005544118 systemd[219734]: Reached target Sockets.
Dec  3 09:48:21 np0005544118 systemd[219734]: Reached target Basic System.
Dec  3 09:48:21 np0005544118 systemd[219734]: Reached target Main User Target.
Dec  3 09:48:21 np0005544118 systemd[219734]: Startup finished in 139ms.
Dec  3 09:48:21 np0005544118 systemd[1]: Started User Manager for UID 42436.
Dec  3 09:48:21 np0005544118 systemd[1]: Started Session 34 of User nova.
Dec  3 09:48:21 np0005544118 systemd[1]: session-34.scope: Deactivated successfully.
Dec  3 09:48:21 np0005544118 systemd-logind[795]: Session 34 logged out. Waiting for processes to exit.
Dec  3 09:48:21 np0005544118 systemd-logind[795]: Removed session 34.
Dec  3 09:48:22 np0005544118 nova_compute[187283]: 2025-12-03 14:48:22.243 187287 DEBUG nova.compute.manager [req-508bcd25-8651-4a88-919e-6abb92b870b6 req-5d01f795-e3d6-4583-861b-38626b95d829 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-unplugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:22 np0005544118 nova_compute[187283]: 2025-12-03 14:48:22.244 187287 DEBUG oslo_concurrency.lockutils [req-508bcd25-8651-4a88-919e-6abb92b870b6 req-5d01f795-e3d6-4583-861b-38626b95d829 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:22 np0005544118 nova_compute[187283]: 2025-12-03 14:48:22.244 187287 DEBUG oslo_concurrency.lockutils [req-508bcd25-8651-4a88-919e-6abb92b870b6 req-5d01f795-e3d6-4583-861b-38626b95d829 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:22 np0005544118 nova_compute[187283]: 2025-12-03 14:48:22.244 187287 DEBUG oslo_concurrency.lockutils [req-508bcd25-8651-4a88-919e-6abb92b870b6 req-5d01f795-e3d6-4583-861b-38626b95d829 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:22 np0005544118 nova_compute[187283]: 2025-12-03 14:48:22.244 187287 DEBUG nova.compute.manager [req-508bcd25-8651-4a88-919e-6abb92b870b6 req-5d01f795-e3d6-4583-861b-38626b95d829 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] No waiting events found dispatching network-vif-unplugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:48:22 np0005544118 nova_compute[187283]: 2025-12-03 14:48:22.245 187287 DEBUG nova.compute.manager [req-508bcd25-8651-4a88-919e-6abb92b870b6 req-5d01f795-e3d6-4583-861b-38626b95d829 c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-unplugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:48:23 np0005544118 nova_compute[187283]: 2025-12-03 14:48:23.272 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:24 np0005544118 nova_compute[187283]: 2025-12-03 14:48:24.322 187287 DEBUG nova.compute.manager [req-54202ecc-affa-4c24-bffd-b275b7bb505a req-9581d27b-13ed-4551-90e7-e6ce4fd842df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:24 np0005544118 nova_compute[187283]: 2025-12-03 14:48:24.323 187287 DEBUG oslo_concurrency.lockutils [req-54202ecc-affa-4c24-bffd-b275b7bb505a req-9581d27b-13ed-4551-90e7-e6ce4fd842df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:24 np0005544118 nova_compute[187283]: 2025-12-03 14:48:24.323 187287 DEBUG oslo_concurrency.lockutils [req-54202ecc-affa-4c24-bffd-b275b7bb505a req-9581d27b-13ed-4551-90e7-e6ce4fd842df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:24 np0005544118 nova_compute[187283]: 2025-12-03 14:48:24.324 187287 DEBUG oslo_concurrency.lockutils [req-54202ecc-affa-4c24-bffd-b275b7bb505a req-9581d27b-13ed-4551-90e7-e6ce4fd842df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:24 np0005544118 nova_compute[187283]: 2025-12-03 14:48:24.324 187287 DEBUG nova.compute.manager [req-54202ecc-affa-4c24-bffd-b275b7bb505a req-9581d27b-13ed-4551-90e7-e6ce4fd842df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] No waiting events found dispatching network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:48:24 np0005544118 nova_compute[187283]: 2025-12-03 14:48:24.325 187287 WARNING nova.compute.manager [req-54202ecc-affa-4c24-bffd-b275b7bb505a req-9581d27b-13ed-4551-90e7-e6ce4fd842df c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received unexpected event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:48:25 np0005544118 nova_compute[187283]: 2025-12-03 14:48:25.949 187287 INFO nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Took 6.83 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Dec  3 09:48:25 np0005544118 nova_compute[187283]: 2025-12-03 14:48:25.950 187287 DEBUG nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec  3 09:48:25 np0005544118 nova_compute[187283]: 2025-12-03 14:48:25.966 187287 DEBUG nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp1qpqhwj1',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='0d7f4cb9-d8d7-4db9-8c74-8cf11e501319',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c7a2d69f-99e7-46b1-aef5-ab45b08a7286),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Dec  3 09:48:25 np0005544118 nova_compute[187283]: 2025-12-03 14:48:25.987 187287 DEBUG nova.objects.instance [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lazy-loading 'migration_context' on Instance uuid 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:48:25 np0005544118 nova_compute[187283]: 2025-12-03 14:48:25.988 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Dec  3 09:48:25 np0005544118 nova_compute[187283]: 2025-12-03 14:48:25.989 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Dec  3 09:48:25 np0005544118 nova_compute[187283]: 2025-12-03 14:48:25.990 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.005 187287 DEBUG nova.virt.libvirt.vif [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-762989702',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-762989702',id=27,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:47:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6ed557ae995c4dc49409420624bc5389',ramdisk_id='',reservation_id='r-drhhrkiv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-679609197',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-679609197-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:47:59Z,user_data=None,user_id='ce698eb5a4b84e229f3adc6e31ba044b',uuid=0d7f4cb9-d8d7-4db9-8c74-8cf11e501319,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.005 187287 DEBUG nova.network.os_vif_util [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converting VIF {"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.006 187287 DEBUG nova.network.os_vif_util [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.006 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Updating guest XML with vif config: <interface type="ethernet">
Dec  3 09:48:26 np0005544118 nova_compute[187283]:  <mac address="fa:16:3e:6b:64:87"/>
Dec  3 09:48:26 np0005544118 nova_compute[187283]:  <model type="virtio"/>
Dec  3 09:48:26 np0005544118 nova_compute[187283]:  <driver name="vhost" rx_queue_size="512"/>
Dec  3 09:48:26 np0005544118 nova_compute[187283]:  <mtu size="1442"/>
Dec  3 09:48:26 np0005544118 nova_compute[187283]:  <target dev="tap5ce00fa6-30"/>
Dec  3 09:48:26 np0005544118 nova_compute[187283]: </interface>
Dec  3 09:48:26 np0005544118 nova_compute[187283]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.007 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.027 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.397 187287 DEBUG nova.compute.manager [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-changed-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.398 187287 DEBUG nova.compute.manager [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Refreshing instance network info cache due to event network-changed-5ce00fa6-307c-470b-af47-ed25b5e0b312. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.398 187287 DEBUG oslo_concurrency.lockutils [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.398 187287 DEBUG oslo_concurrency.lockutils [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.398 187287 DEBUG nova.network.neutron [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Refreshing network info cache for port 5ce00fa6-307c-470b-af47-ed25b5e0b312 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.492 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Current None elapsed 0 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.492 187287 INFO nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Dec  3 09:48:26 np0005544118 nova_compute[187283]: 2025-12-03 14:48:26.569 187287 INFO nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Dec  3 09:48:26 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.072 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.072 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:48:27 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 09:48:27 np0005544118 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.576 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Current 50 elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.577 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.807 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773307.8073783, 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.808 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] VM Paused (Lifecycle Event)#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.832 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.837 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:48:27 np0005544118 nova_compute[187283]: 2025-12-03 14:48:27.909 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.080 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.081 187287 DEBUG nova.virt.libvirt.migration [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Dec  3 09:48:28 np0005544118 kernel: tap5ce00fa6-30 (unregistering): left promiscuous mode
Dec  3 09:48:28 np0005544118 NetworkManager[55710]: <info>  [1764773308.1114] device (tap5ce00fa6-30): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:48:28 np0005544118 ovn_controller[95637]: 2025-12-03T14:48:28Z|00249|binding|INFO|Releasing lport 5ce00fa6-307c-470b-af47-ed25b5e0b312 from this chassis (sb_readonly=0)
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.120 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:28 np0005544118 ovn_controller[95637]: 2025-12-03T14:48:28Z|00250|binding|INFO|Setting lport 5ce00fa6-307c-470b-af47-ed25b5e0b312 down in Southbound
Dec  3 09:48:28 np0005544118 ovn_controller[95637]: 2025-12-03T14:48:28Z|00251|binding|INFO|Removing iface tap5ce00fa6-30 ovn-installed in OVS
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.138 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:28 np0005544118 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Dec  3 09:48:28 np0005544118 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001b.scope: Consumed 13.820s CPU time.
Dec  3 09:48:28 np0005544118 systemd-machined[153602]: Machine qemu-23-instance-0000001b terminated.
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.274 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.296 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:64:87 10.100.0.7'], port_security=['fa:16:3e:6b:64:87 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '3a9d7e7b-04f9-4aed-a199-9003ff5fe58c'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '0d7f4cb9-d8d7-4db9-8c74-8cf11e501319', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6ed557ae995c4dc49409420624bc5389', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'aa02460e-59ff-4b52-88f0-d6fc372580d6', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=790402b3-76a5-4b8b-b2dd-fb5e875c16a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=5ce00fa6-307c-470b-af47-ed25b5e0b312) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.298 104491 INFO neutron.agent.ovn.metadata.agent [-] Port 5ce00fa6-307c-470b-af47-ed25b5e0b312 in datapath 063c09de-78bd-4f5c-9da9-f47a22f5ccd6 unbound from our chassis#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.300 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 063c09de-78bd-4f5c-9da9-f47a22f5ccd6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.301 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[809712fa-755d-4e69-8deb-91a0a1b5e2bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.302 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6 namespace which is not needed anymore#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.363 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.363 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.364 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Dec  3 09:48:28 np0005544118 neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6[219608]: [NOTICE]   (219624) : haproxy version is 2.8.14-c23fe91
Dec  3 09:48:28 np0005544118 neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6[219608]: [NOTICE]   (219624) : path to executable is /usr/sbin/haproxy
Dec  3 09:48:28 np0005544118 neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6[219608]: [WARNING]  (219624) : Exiting Master process...
Dec  3 09:48:28 np0005544118 neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6[219608]: [ALERT]    (219624) : Current worker (219626) exited with code 143 (Terminated)
Dec  3 09:48:28 np0005544118 neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6[219608]: [WARNING]  (219624) : All workers exited. Exiting... (0)
Dec  3 09:48:28 np0005544118 systemd[1]: libpod-5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272.scope: Deactivated successfully.
Dec  3 09:48:28 np0005544118 podman[219813]: 2025-12-03 14:48:28.443843895 +0000 UTC m=+0.044426235 container died 5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125)
Dec  3 09:48:28 np0005544118 systemd[1]: var-lib-containers-storage-overlay-ae5f302899357e096bf9ef756ced0cc12f1cb41ef869361723e1833ae104fe58-merged.mount: Deactivated successfully.
Dec  3 09:48:28 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272-userdata-shm.mount: Deactivated successfully.
Dec  3 09:48:28 np0005544118 podman[219813]: 2025-12-03 14:48:28.47804463 +0000 UTC m=+0.078626970 container cleanup 5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:48:28 np0005544118 systemd[1]: libpod-conmon-5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272.scope: Deactivated successfully.
Dec  3 09:48:28 np0005544118 podman[219845]: 2025-12-03 14:48:28.536243549 +0000 UTC m=+0.038751979 container remove 5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.541 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[fd458228-34a8-46da-b5f4-bf883a21895a]: (4, ('Wed Dec  3 02:48:28 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6 (5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272)\n5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272\nWed Dec  3 02:48:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6 (5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272)\n5894b43283fdca1c913b0da775a8b7a6ae7bbf08d3617f053c65d4fbca6a9272\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.544 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[11af1e5e-b8af-4ee0-9831-c582c6b8880e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.545 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap063c09de-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.547 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:28 np0005544118 kernel: tap063c09de-70: left promiscuous mode
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.564 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.567 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.571 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[29634825-3c7b-46b3-82d1-22ab3f9f77e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.584 187287 DEBUG nova.virt.libvirt.guest [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '0d7f4cb9-d8d7-4db9-8c74-8cf11e501319' (instance-0000001b) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.585 187287 INFO nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Migration operation has completed#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.585 187287 INFO nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] _post_live_migration() is started..#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.584 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6ab0bc-de3e-4d9c-beb2-d38c45485231]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.586 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a4267ba8-f68b-426f-95fb-fff4fb16fdfa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.599 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[7da6275d-9b6d-4be0-988f-b9af88aea2be]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 543440, 'reachable_time': 25884, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219864, 'error': None, 'target': 'ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.601 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-063c09de-78bd-4f5c-9da9-f47a22f5ccd6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:48:28 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:28.602 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[5584c90c-56e9-44d6-bb5f-a7329eedea9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:48:28 np0005544118 systemd[1]: run-netns-ovnmeta\x2d063c09de\x2d78bd\x2d4f5c\x2d9da9\x2df47a22f5ccd6.mount: Deactivated successfully.
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.670 187287 DEBUG nova.network.neutron [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Updated VIF entry in instance network info cache for port 5ce00fa6-307c-470b-af47-ed25b5e0b312. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.670 187287 DEBUG nova.network.neutron [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Updating instance_info_cache with network_info: [{"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.690 187287 DEBUG oslo_concurrency.lockutils [req-a52855e3-ed1b-4815-b58b-d7821c3dbfaa req-b57579af-a8f9-47b1-8bc4-4d65743dc40c c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-0d7f4cb9-d8d7-4db9-8c74-8cf11e501319" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.869 187287 DEBUG nova.compute.manager [req-4cd1f23b-96a4-4d99-bc23-9db3ef88c4d9 req-5e571353-3f39-4436-a857-ff7d5bfa5c4a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-unplugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.870 187287 DEBUG oslo_concurrency.lockutils [req-4cd1f23b-96a4-4d99-bc23-9db3ef88c4d9 req-5e571353-3f39-4436-a857-ff7d5bfa5c4a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.870 187287 DEBUG oslo_concurrency.lockutils [req-4cd1f23b-96a4-4d99-bc23-9db3ef88c4d9 req-5e571353-3f39-4436-a857-ff7d5bfa5c4a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.870 187287 DEBUG oslo_concurrency.lockutils [req-4cd1f23b-96a4-4d99-bc23-9db3ef88c4d9 req-5e571353-3f39-4436-a857-ff7d5bfa5c4a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.870 187287 DEBUG nova.compute.manager [req-4cd1f23b-96a4-4d99-bc23-9db3ef88c4d9 req-5e571353-3f39-4436-a857-ff7d5bfa5c4a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] No waiting events found dispatching network-vif-unplugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:48:28 np0005544118 nova_compute[187283]: 2025-12-03 14:48:28.870 187287 DEBUG nova.compute.manager [req-4cd1f23b-96a4-4d99-bc23-9db3ef88c4d9 req-5e571353-3f39-4436-a857-ff7d5bfa5c4a c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-unplugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:48:29 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:29.278 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.278 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:29 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:29.279 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.375 187287 DEBUG nova.network.neutron [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Activated binding for port 5ce00fa6-307c-470b-af47-ed25b5e0b312 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.376 187287 DEBUG nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.377 187287 DEBUG nova.virt.libvirt.vif [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:47:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalancingStrategy-server-762989702',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancingstrategy-server-762989702',id=27,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:47:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6ed557ae995c4dc49409420624bc5389',ramdisk_id='',reservation_id='r-drhhrkiv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalancingStrategy-679609197',owner_user_name='tempest-TestExecuteWorkloadBalancingStrategy-679609197-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:48:15Z,user_data=None,user_id='ce698eb5a4b84e229f3adc6e31ba044b',uuid=0d7f4cb9-d8d7-4db9-8c74-8cf11e501319,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.378 187287 DEBUG nova.network.os_vif_util [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converting VIF {"id": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "address": "fa:16:3e:6b:64:87", "network": {"id": "063c09de-78bd-4f5c-9da9-f47a22f5ccd6", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalancingStrategy-272284966-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6ed557ae995c4dc49409420624bc5389", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ce00fa6-30", "ovs_interfaceid": "5ce00fa6-307c-470b-af47-ed25b5e0b312", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.379 187287 DEBUG nova.network.os_vif_util [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.379 187287 DEBUG os_vif [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.381 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.382 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5ce00fa6-30, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.383 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.386 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.389 187287 INFO os_vif [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:64:87,bridge_name='br-int',has_traffic_filtering=True,id=5ce00fa6-307c-470b-af47-ed25b5e0b312,network=Network(063c09de-78bd-4f5c-9da9-f47a22f5ccd6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ce00fa6-30')#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.389 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.390 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.390 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.391 187287 DEBUG nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.391 187287 INFO nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Deleting instance files /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319_del#033[00m
Dec  3 09:48:29 np0005544118 nova_compute[187283]: 2025-12-03 14:48:29.392 187287 INFO nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Deletion of /var/lib/nova/instances/0d7f4cb9-d8d7-4db9-8c74-8cf11e501319_del complete#033[00m
Dec  3 09:48:29 np0005544118 podman[219865]: 2025-12-03 14:48:29.863160309 +0000 UTC m=+0.084887440 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, 
distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.946 187287 DEBUG nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.946 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.947 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.947 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.947 187287 DEBUG nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] No waiting events found dispatching network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.947 187287 WARNING nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received unexpected event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.948 187287 DEBUG nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.948 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.948 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.948 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.948 187287 DEBUG nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] No waiting events found dispatching network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.948 187287 WARNING nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received unexpected event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.949 187287 DEBUG nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.949 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.949 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.949 187287 DEBUG oslo_concurrency.lockutils [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.949 187287 DEBUG nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] No waiting events found dispatching network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:48:30 np0005544118 nova_compute[187283]: 2025-12-03 14:48:30.949 187287 WARNING nova.compute.manager [req-bab7ba8f-28ab-4eb2-96cb-047e5c52f7e6 req-dca9ca68-bc13-4626-8628-452b0a5481fd c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Received unexpected event network-vif-plugged-5ce00fa6-307c-470b-af47-ed25b5e0b312 for instance with vm_state active and task_state migrating.#033[00m
Dec  3 09:48:31 np0005544118 nova_compute[187283]: 2025-12-03 14:48:31.029 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:31 np0005544118 systemd[1]: Stopping User Manager for UID 42436...
Dec  3 09:48:31 np0005544118 systemd[219734]: Activating special unit Exit the Session...
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped target Main User Target.
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped target Basic System.
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped target Paths.
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped target Sockets.
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped target Timers.
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped Mark boot as successful after the user session has run 2 minutes.
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped Daily Cleanup of User's Temporary Directories.
Dec  3 09:48:31 np0005544118 systemd[219734]: Closed D-Bus User Message Bus Socket.
Dec  3 09:48:31 np0005544118 systemd[219734]: Stopped Create User's Volatile Files and Directories.
Dec  3 09:48:31 np0005544118 systemd[219734]: Removed slice User Application Slice.
Dec  3 09:48:31 np0005544118 systemd[219734]: Reached target Shutdown.
Dec  3 09:48:31 np0005544118 systemd[219734]: Finished Exit the Session.
Dec  3 09:48:31 np0005544118 systemd[219734]: Reached target Exit the Session.
Dec  3 09:48:31 np0005544118 systemd[1]: user@42436.service: Deactivated successfully.
Dec  3 09:48:31 np0005544118 systemd[1]: Stopped User Manager for UID 42436.
Dec  3 09:48:31 np0005544118 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Dec  3 09:48:31 np0005544118 systemd[1]: run-user-42436.mount: Deactivated successfully.
Dec  3 09:48:31 np0005544118 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Dec  3 09:48:31 np0005544118 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Dec  3 09:48:31 np0005544118 systemd[1]: Removed slice User Slice of UID 42436.
Dec  3 09:48:32 np0005544118 nova_compute[187283]: 2025-12-03 14:48:32.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:33 np0005544118 nova_compute[187283]: 2025-12-03 14:48:33.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.060 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.061 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.061 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "0d7f4cb9-d8d7-4db9-8c74-8cf11e501319-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.092 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.092 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.092 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.092 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:48:34 np0005544118 podman[219890]: 2025-12-03 14:48:34.21648046 +0000 UTC m=+0.074964679 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.251 187287 WARNING nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.252 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5854MB free_disk=73.3337631225586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": 
"0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.253 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.253 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.307 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Migration for instance 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.324 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.347 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Migration c7a2d69f-99e7-46b1-aef5-ab45b08a7286 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.347 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.347 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.385 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.426 187287 DEBUG nova.compute.provider_tree [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.453 187287 DEBUG nova.scheduler.client.report [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.481 187287 DEBUG nova.compute.resource_tracker [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.482 187287 DEBUG oslo_concurrency.lockutils [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.492 187287 INFO nova.compute.manager [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.715 187287 INFO nova.scheduler.client.report [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Deleted allocation for migration c7a2d69f-99e7-46b1-aef5-ab45b08a7286#033[00m
Dec  3 09:48:34 np0005544118 nova_compute[187283]: 2025-12-03 14:48:34.716 187287 DEBUG nova.virt.libvirt.driver [None req-9de90d72-85a5-43c2-addc-c0b00f809584 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Dec  3 09:48:35 np0005544118 podman[197639]: time="2025-12-03T14:48:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:48:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:48:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:48:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:48:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec  3 09:48:36 np0005544118 nova_compute[187283]: 2025-12-03 14:48:36.032 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:37 np0005544118 nova_compute[187283]: 2025-12-03 14:48:37.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:37 np0005544118 nova_compute[187283]: 2025-12-03 14:48:37.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:48:37 np0005544118 nova_compute[187283]: 2025-12-03 14:48:37.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:48:37 np0005544118 nova_compute[187283]: 2025-12-03 14:48:37.630 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:48:38 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:48:38.281 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.627 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.628 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.628 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.628 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.772 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.773 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5862MB free_disk=73.33374404907227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.773 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.773 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.947 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.947 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:48:38 np0005544118 nova_compute[187283]: 2025-12-03 14:48:38.989 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:48:39 np0005544118 nova_compute[187283]: 2025-12-03 14:48:39.006 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:48:39 np0005544118 nova_compute[187283]: 2025-12-03 14:48:39.007 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:48:39 np0005544118 nova_compute[187283]: 2025-12-03 14:48:39.008 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:48:39 np0005544118 nova_compute[187283]: 2025-12-03 14:48:39.386 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:40 np0005544118 podman[219910]: 2025-12-03 14:48:40.813418701 +0000 UTC m=+0.050096731 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec  3 09:48:41 np0005544118 nova_compute[187283]: 2025-12-03 14:48:41.004 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:41 np0005544118 nova_compute[187283]: 2025-12-03 14:48:41.033 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:43 np0005544118 nova_compute[187283]: 2025-12-03 14:48:43.361 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764773308.359859, 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:48:43 np0005544118 nova_compute[187283]: 2025-12-03 14:48:43.361 187287 INFO nova.compute.manager [-] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:48:43 np0005544118 nova_compute[187283]: 2025-12-03 14:48:43.392 187287 DEBUG nova.compute.manager [None req-28dcf20f-dffe-4d74-a2e9-3d815d515a49 - - - - - -] [instance: 0d7f4cb9-d8d7-4db9-8c74-8cf11e501319] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:48:44 np0005544118 nova_compute[187283]: 2025-12-03 14:48:44.388 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:45 np0005544118 nova_compute[187283]: 2025-12-03 14:48:45.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:48:45 np0005544118 nova_compute[187283]: 2025-12-03 14:48:45.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:48:45 np0005544118 podman[219929]: 2025-12-03 14:48:45.824435905 +0000 UTC m=+0.057369609 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:48:46 np0005544118 nova_compute[187283]: 2025-12-03 14:48:46.068 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:49 np0005544118 nova_compute[187283]: 2025-12-03 14:48:49.390 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:48:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:48:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:48:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:48:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:48:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:48:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:48:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:48:51 np0005544118 nova_compute[187283]: 2025-12-03 14:48:51.072 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:51 np0005544118 podman[219955]: 2025-12-03 14:48:51.8744367 +0000 UTC m=+0.110536071 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:48:54 np0005544118 nova_compute[187283]: 2025-12-03 14:48:54.392 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:56 np0005544118 nova_compute[187283]: 2025-12-03 14:48:56.073 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:48:58 np0005544118 ovn_controller[95637]: 2025-12-03T14:48:58Z|00252|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Dec  3 09:48:59 np0005544118 nova_compute[187283]: 2025-12-03 14:48:59.394 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:00 np0005544118 podman[219982]: 2025-12-03 14:49:00.824628417 +0000 UTC m=+0.059846696 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  3 09:49:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:49:00.988 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:49:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:49:00.988 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:49:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:49:00.988 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:49:01 np0005544118 nova_compute[187283]: 2025-12-03 14:49:01.076 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:04 np0005544118 nova_compute[187283]: 2025-12-03 14:49:04.396 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:04 np0005544118 podman[220003]: 2025-12-03 14:49:04.822652429 +0000 UTC m=+0.055974249 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec  3 09:49:05 np0005544118 podman[197639]: time="2025-12-03T14:49:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:49:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:49:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:49:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:49:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec  3 09:49:06 np0005544118 nova_compute[187283]: 2025-12-03 14:49:06.078 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:09 np0005544118 nova_compute[187283]: 2025-12-03 14:49:09.398 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:11 np0005544118 nova_compute[187283]: 2025-12-03 14:49:11.081 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:11 np0005544118 podman[220023]: 2025-12-03 14:49:11.828253446 +0000 UTC m=+0.058157859 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:49:14 np0005544118 nova_compute[187283]: 2025-12-03 14:49:14.401 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:16 np0005544118 nova_compute[187283]: 2025-12-03 14:49:16.084 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:16 np0005544118 podman[220042]: 2025-12-03 14:49:16.847934216 +0000 UTC m=+0.080302993 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:49:17 np0005544118 nova_compute[187283]: 2025-12-03 14:49:17.109 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:17 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:49:17.110 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:49:17 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:49:17.112 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:49:19 np0005544118 nova_compute[187283]: 2025-12-03 14:49:19.403 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:49:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:49:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:49:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:49:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:49:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:49:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:49:21 np0005544118 nova_compute[187283]: 2025-12-03 14:49:21.086 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:22 np0005544118 podman[220066]: 2025-12-03 14:49:22.854330531 +0000 UTC m=+0.085051361 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec  3 09:49:24 np0005544118 nova_compute[187283]: 2025-12-03 14:49:24.406 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:26 np0005544118 nova_compute[187283]: 2025-12-03 14:49:26.090 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:26 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:49:26.115 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:49:28 np0005544118 nova_compute[187283]: 2025-12-03 14:49:28.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:28 np0005544118 nova_compute[187283]: 2025-12-03 14:49:28.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:29 np0005544118 nova_compute[187283]: 2025-12-03 14:49:29.407 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:30 np0005544118 nova_compute[187283]: 2025-12-03 14:49:30.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:31 np0005544118 nova_compute[187283]: 2025-12-03 14:49:31.092 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:31 np0005544118 podman[220093]: 2025-12-03 14:49:31.814450568 +0000 UTC m=+0.051915851 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9-minimal)
Dec  3 09:49:34 np0005544118 nova_compute[187283]: 2025-12-03 14:49:34.411 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:34 np0005544118 nova_compute[187283]: 2025-12-03 14:49:34.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:35 np0005544118 nova_compute[187283]: 2025-12-03 14:49:35.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:35 np0005544118 podman[197639]: time="2025-12-03T14:49:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:49:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:49:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:49:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:49:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec  3 09:49:35 np0005544118 podman[220115]: 2025-12-03 14:49:35.894358931 +0000 UTC m=+0.126747818 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:49:36 np0005544118 nova_compute[187283]: 2025-12-03 14:49:36.095 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.659 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.659 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.683 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.684 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.684 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.685 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.819 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.820 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5874MB free_disk=73.3337631225586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.820 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.821 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.907 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.907 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.928 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.950 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.950 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.970 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:49:38 np0005544118 nova_compute[187283]: 2025-12-03 14:49:38.992 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:49:39 np0005544118 nova_compute[187283]: 2025-12-03 14:49:39.012 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:49:39 np0005544118 nova_compute[187283]: 2025-12-03 14:49:39.032 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:49:39 np0005544118 nova_compute[187283]: 2025-12-03 14:49:39.034 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:49:39 np0005544118 nova_compute[187283]: 2025-12-03 14:49:39.035 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:49:39 np0005544118 nova_compute[187283]: 2025-12-03 14:49:39.413 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:41 np0005544118 nova_compute[187283]: 2025-12-03 14:49:41.029 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:41 np0005544118 nova_compute[187283]: 2025-12-03 14:49:41.096 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:42 np0005544118 podman[220136]: 2025-12-03 14:49:42.849440104 +0000 UTC m=+0.082801261 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec  3 09:49:43 np0005544118 nova_compute[187283]: 2025-12-03 14:49:43.601 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:44 np0005544118 nova_compute[187283]: 2025-12-03 14:49:44.415 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:45 np0005544118 nova_compute[187283]: 2025-12-03 14:49:45.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:49:45 np0005544118 nova_compute[187283]: 2025-12-03 14:49:45.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:49:46 np0005544118 nova_compute[187283]: 2025-12-03 14:49:46.099 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:47 np0005544118 ovn_controller[95637]: 2025-12-03T14:49:47Z|00253|memory_trim|INFO|Detected inactivity (last active 30021 ms ago): trimming memory
Dec  3 09:49:47 np0005544118 podman[220157]: 2025-12-03 14:49:47.808238135 +0000 UTC m=+0.042411888 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:49:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:49:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:49:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:49:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:49:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:49:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:49:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:49:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:49:49 np0005544118 nova_compute[187283]: 2025-12-03 14:49:49.416 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:51 np0005544118 nova_compute[187283]: 2025-12-03 14:49:51.100 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:53 np0005544118 podman[220181]: 2025-12-03 14:49:53.836969799 +0000 UTC m=+0.070887504 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:49:54 np0005544118 nova_compute[187283]: 2025-12-03 14:49:54.418 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:56 np0005544118 nova_compute[187283]: 2025-12-03 14:49:56.102 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:49:59 np0005544118 nova_compute[187283]: 2025-12-03 14:49:59.421 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:50:00.988 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:50:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:50:00.989 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:50:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:50:00.989 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:50:01 np0005544118 nova_compute[187283]: 2025-12-03 14:50:01.104 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:02 np0005544118 podman[220207]: 2025-12-03 14:50:02.819209714 +0000 UTC m=+0.056468513 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec  3 09:50:04 np0005544118 nova_compute[187283]: 2025-12-03 14:50:04.423 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:05 np0005544118 podman[197639]: time="2025-12-03T14:50:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:50:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:50:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:50:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:50:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec  3 09:50:06 np0005544118 nova_compute[187283]: 2025-12-03 14:50:06.106 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:06 np0005544118 podman[220228]: 2025-12-03 14:50:06.819584294 +0000 UTC m=+0.048648824 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  3 09:50:08 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:50:08.977 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:50:08 np0005544118 nova_compute[187283]: 2025-12-03 14:50:08.977 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:08 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:50:08.978 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:50:08 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:50:08.979 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:50:09 np0005544118 nova_compute[187283]: 2025-12-03 14:50:09.425 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:10 np0005544118 nova_compute[187283]: 2025-12-03 14:50:10.297 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:11 np0005544118 nova_compute[187283]: 2025-12-03 14:50:11.107 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:13 np0005544118 podman[220248]: 2025-12-03 14:50:13.814752311 +0000 UTC m=+0.047433982 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  3 09:50:14 np0005544118 nova_compute[187283]: 2025-12-03 14:50:14.427 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:16 np0005544118 nova_compute[187283]: 2025-12-03 14:50:16.108 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:18 np0005544118 podman[220267]: 2025-12-03 14:50:18.81639195 +0000 UTC m=+0.049589500 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:50:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:50:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:50:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:50:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:50:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:50:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:50:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:50:19 np0005544118 nova_compute[187283]: 2025-12-03 14:50:19.429 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:21 np0005544118 nova_compute[187283]: 2025-12-03 14:50:21.112 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:24 np0005544118 nova_compute[187283]: 2025-12-03 14:50:24.431 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:24 np0005544118 podman[220291]: 2025-12-03 14:50:24.860058181 +0000 UTC m=+0.090987539 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  3 09:50:26 np0005544118 nova_compute[187283]: 2025-12-03 14:50:26.119 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:28 np0005544118 nova_compute[187283]: 2025-12-03 14:50:28.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:29 np0005544118 nova_compute[187283]: 2025-12-03 14:50:29.434 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:29 np0005544118 nova_compute[187283]: 2025-12-03 14:50:29.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:31 np0005544118 nova_compute[187283]: 2025-12-03 14:50:31.121 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:32 np0005544118 nova_compute[187283]: 2025-12-03 14:50:32.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:33 np0005544118 podman[220318]: 2025-12-03 14:50:33.829445554 +0000 UTC m=+0.064525126 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
release=1755695350, architecture=x86_64, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec  3 09:50:34 np0005544118 nova_compute[187283]: 2025-12-03 14:50:34.436 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:35 np0005544118 nova_compute[187283]: 2025-12-03 14:50:35.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:35 np0005544118 podman[197639]: time="2025-12-03T14:50:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:50:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:50:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:50:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:50:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec  3 09:50:36 np0005544118 nova_compute[187283]: 2025-12-03 14:50:36.122 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:36 np0005544118 nova_compute[187283]: 2025-12-03 14:50:36.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:37 np0005544118 podman[220339]: 2025-12-03 14:50:37.843524409 +0000 UTC m=+0.078413576 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.632 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.633 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.633 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.633 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.774 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.775 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5867MB free_disk=73.33425903320312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.775 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.776 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.866 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.867 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.889 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.905 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.906 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:50:38 np0005544118 nova_compute[187283]: 2025-12-03 14:50:38.906 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:50:39 np0005544118 nova_compute[187283]: 2025-12-03 14:50:39.441 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:40 np0005544118 nova_compute[187283]: 2025-12-03 14:50:40.905 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:40 np0005544118 nova_compute[187283]: 2025-12-03 14:50:40.905 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:50:40 np0005544118 nova_compute[187283]: 2025-12-03 14:50:40.905 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:50:40 np0005544118 nova_compute[187283]: 2025-12-03 14:50:40.925 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:50:41 np0005544118 nova_compute[187283]: 2025-12-03 14:50:41.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:41 np0005544118 nova_compute[187283]: 2025-12-03 14:50:41.623 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:44 np0005544118 nova_compute[187283]: 2025-12-03 14:50:44.443 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:44 np0005544118 podman[220359]: 2025-12-03 14:50:44.844740577 +0000 UTC m=+0.079630507 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec  3 09:50:46 np0005544118 nova_compute[187283]: 2025-12-03 14:50:46.125 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:46 np0005544118 nova_compute[187283]: 2025-12-03 14:50:46.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:46 np0005544118 nova_compute[187283]: 2025-12-03 14:50:46.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:50:46 np0005544118 ovn_controller[95637]: 2025-12-03T14:50:46Z|00254|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec  3 09:50:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:50:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:50:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:50:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:50:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:50:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:50:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:50:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:50:49 np0005544118 nova_compute[187283]: 2025-12-03 14:50:49.445 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:49 np0005544118 podman[220378]: 2025-12-03 14:50:49.81867633 +0000 UTC m=+0.052407734 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:50:51 np0005544118 nova_compute[187283]: 2025-12-03 14:50:51.127 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:54 np0005544118 nova_compute[187283]: 2025-12-03 14:50:54.448 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:55 np0005544118 podman[220402]: 2025-12-03 14:50:55.867355595 +0000 UTC m=+0.094033181 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:50:56 np0005544118 nova_compute[187283]: 2025-12-03 14:50:56.128 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.450 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.608 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.608 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.609 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.609 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.610 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.610 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.635 187287 DEBUG nova.virt.libvirt.imagecache [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.635 187287 WARNING nova.virt.libvirt.imagecache [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Unknown base file: /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.635 187287 INFO nova.virt.libvirt.imagecache [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Removable base files: /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.636 187287 INFO nova.virt.libvirt.imagecache [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.636 187287 DEBUG nova.virt.libvirt.imagecache [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.636 187287 DEBUG nova.virt.libvirt.imagecache [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Dec  3 09:50:59 np0005544118 nova_compute[187283]: 2025-12-03 14:50:59.637 187287 DEBUG nova.virt.libvirt.imagecache [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Dec  3 09:51:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:00.989 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:00.990 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:00.990 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:01 np0005544118 nova_compute[187283]: 2025-12-03 14:51:01.130 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:04 np0005544118 nova_compute[187283]: 2025-12-03 14:51:04.452 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:04 np0005544118 podman[220430]: 2025-12-03 14:51:04.838787185 +0000 UTC m=+0.063719535 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec  3 09:51:05 np0005544118 podman[197639]: time="2025-12-03T14:51:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:51:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:51:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:51:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:51:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec  3 09:51:06 np0005544118 nova_compute[187283]: 2025-12-03 14:51:06.132 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:08 np0005544118 podman[220451]: 2025-12-03 14:51:08.820496158 +0000 UTC m=+0.052243260 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:51:09 np0005544118 nova_compute[187283]: 2025-12-03 14:51:09.454 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:09 np0005544118 nova_compute[187283]: 2025-12-03 14:51:09.687 187287 DEBUG nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Creating tmpfile /var/lib/nova/instances/tmprk133hab to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec  3 09:51:09 np0005544118 nova_compute[187283]: 2025-12-03 14:51:09.687 187287 DEBUG nova.compute.manager [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprk133hab',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec  3 09:51:11 np0005544118 nova_compute[187283]: 2025-12-03 14:51:11.134 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:12 np0005544118 nova_compute[187283]: 2025-12-03 14:51:12.360 187287 DEBUG nova.compute.manager [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprk133hab',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595be9f0-87f1-41d0-bdbd-539f6a8ec018',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec  3 09:51:12 np0005544118 nova_compute[187283]: 2025-12-03 14:51:12.405 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-595be9f0-87f1-41d0-bdbd-539f6a8ec018" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:51:12 np0005544118 nova_compute[187283]: 2025-12-03 14:51:12.406 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-595be9f0-87f1-41d0-bdbd-539f6a8ec018" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:51:12 np0005544118 nova_compute[187283]: 2025-12-03 14:51:12.406 187287 DEBUG nova.network.neutron [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.158 187287 DEBUG nova.network.neutron [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Updating instance_info_cache with network_info: [{"id": "e1749c32-128f-4893-ac6b-8fc030d09c07", "address": "fa:16:3e:ca:06:81", "network": {"id": "3acc3251-99c7-4dd7-8875-b3546fb1889c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-229839387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d44f73f0a904ada8d2928eb93138c1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1749c32-12", "ovs_interfaceid": "e1749c32-128f-4893-ac6b-8fc030d09c07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.304 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-595be9f0-87f1-41d0-bdbd-539f6a8ec018" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.306 187287 DEBUG nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprk133hab',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595be9f0-87f1-41d0-bdbd-539f6a8ec018',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.306 187287 DEBUG nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Creating instance directory: /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.307 187287 DEBUG nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Creating disk.info with the contents: {'/var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk': 'qcow2', '/var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.307 187287 DEBUG nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.308 187287 DEBUG nova.objects.instance [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 595be9f0-87f1-41d0-bdbd-539f6a8ec018 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.335 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.410 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.411 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "ff2b4b2f577a57aded877ed6ce326762685b503b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.412 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.423 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.456 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.488 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.488 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.524 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b,backing_fmt=raw /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.525 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "ff2b4b2f577a57aded877ed6ce326762685b503b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.525 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.580 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/ff2b4b2f577a57aded877ed6ce326762685b503b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.581 187287 DEBUG nova.virt.disk.api [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Checking if we can resize image /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.582 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.641 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.643 187287 DEBUG nova.virt.disk.api [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Cannot resize image /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.643 187287 DEBUG nova.objects.instance [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lazy-loading 'migration_context' on Instance uuid 595be9f0-87f1-41d0-bdbd-539f6a8ec018 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.695 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.716 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk.config 485376" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.718 187287 DEBUG nova.virt.libvirt.volume.remotefs [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk.config to /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018 copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec  3 09:51:14 np0005544118 nova_compute[187283]: 2025-12-03 14:51:14.718 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk.config /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.317 187287 DEBUG oslo_concurrency.processutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018/disk.config /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018" returned: 0 in 0.599s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.318 187287 DEBUG nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.319 187287 DEBUG nova.virt.libvirt.vif [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-03T14:50:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1737275869',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1737275869',id=29,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:50:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3d44f73f0a904ada8d2928eb93138c1b',ramdisk_id='',reservation_id='r-w70vd0pt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-8520329',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-8520329-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-03T14:50:59Z,user_data=None,user_id='eac4d94280c04dff8a59ae8e5c542f6a',uuid=595be9f0-87f1-41d0-bdbd-539f6a8ec018,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1749c32-128f-4893-ac6b-8fc030d09c07", "address": "fa:16:3e:ca:06:81", "network": {"id": "3acc3251-99c7-4dd7-8875-b3546fb1889c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-229839387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d44f73f0a904ada8d2928eb93138c1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape1749c32-12", "ovs_interfaceid": "e1749c32-128f-4893-ac6b-8fc030d09c07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.319 187287 DEBUG nova.network.os_vif_util [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converting VIF {"id": "e1749c32-128f-4893-ac6b-8fc030d09c07", "address": "fa:16:3e:ca:06:81", "network": {"id": "3acc3251-99c7-4dd7-8875-b3546fb1889c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-229839387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d44f73f0a904ada8d2928eb93138c1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape1749c32-12", "ovs_interfaceid": "e1749c32-128f-4893-ac6b-8fc030d09c07", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.320 187287 DEBUG nova.network.os_vif_util [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ca:06:81,bridge_name='br-int',has_traffic_filtering=True,id=e1749c32-128f-4893-ac6b-8fc030d09c07,network=Network(3acc3251-99c7-4dd7-8875-b3546fb1889c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1749c32-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.320 187287 DEBUG os_vif [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:06:81,bridge_name='br-int',has_traffic_filtering=True,id=e1749c32-128f-4893-ac6b-8fc030d09c07,network=Network(3acc3251-99c7-4dd7-8875-b3546fb1889c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1749c32-12') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.321 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.322 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.322 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.325 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.325 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1749c32-12, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.326 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape1749c32-12, col_values=(('external_ids', {'iface-id': 'e1749c32-128f-4893-ac6b-8fc030d09c07', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ca:06:81', 'vm-uuid': '595be9f0-87f1-41d0-bdbd-539f6a8ec018'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.327 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.329 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:51:15 np0005544118 NetworkManager[55710]: <info>  [1764773475.3297] manager: (tape1749c32-12): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.334 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.334 187287 INFO os_vif [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ca:06:81,bridge_name='br-int',has_traffic_filtering=True,id=e1749c32-128f-4893-ac6b-8fc030d09c07,network=Network(3acc3251-99c7-4dd7-8875-b3546fb1889c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1749c32-12')#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.335 187287 DEBUG nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec  3 09:51:15 np0005544118 nova_compute[187283]: 2025-12-03 14:51:15.335 187287 DEBUG nova.compute.manager [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprk133hab',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595be9f0-87f1-41d0-bdbd-539f6a8ec018',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec  3 09:51:15 np0005544118 podman[220494]: 2025-12-03 14:51:15.822390785 +0000 UTC m=+0.054514720 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec  3 09:51:16 np0005544118 nova_compute[187283]: 2025-12-03 14:51:16.138 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:16.480 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:51:16 np0005544118 nova_compute[187283]: 2025-12-03 14:51:16.481 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:16 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:16.481 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:51:17 np0005544118 nova_compute[187283]: 2025-12-03 14:51:17.365 187287 DEBUG nova.network.neutron [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Port e1749c32-128f-4893-ac6b-8fc030d09c07 updated with migration profile {'migrating_to': 'compute-1.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec  3 09:51:17 np0005544118 nova_compute[187283]: 2025-12-03 14:51:17.366 187287 DEBUG nova.compute.manager [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmprk133hab',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='595be9f0-87f1-41d0-bdbd-539f6a8ec018',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec  3 09:51:17 np0005544118 systemd[1]: Starting libvirt proxy daemon...
Dec  3 09:51:17 np0005544118 systemd[1]: Started libvirt proxy daemon.
Dec  3 09:51:17 np0005544118 kernel: tape1749c32-12: entered promiscuous mode
Dec  3 09:51:17 np0005544118 NetworkManager[55710]: <info>  [1764773477.7168] manager: (tape1749c32-12): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Dec  3 09:51:17 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:17Z|00255|binding|INFO|Claiming lport e1749c32-128f-4893-ac6b-8fc030d09c07 for this additional chassis.
Dec  3 09:51:17 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:17Z|00256|binding|INFO|e1749c32-128f-4893-ac6b-8fc030d09c07: Claiming fa:16:3e:ca:06:81 10.100.0.14
Dec  3 09:51:17 np0005544118 nova_compute[187283]: 2025-12-03 14:51:17.717 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:17 np0005544118 nova_compute[187283]: 2025-12-03 14:51:17.721 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:17 np0005544118 nova_compute[187283]: 2025-12-03 14:51:17.724 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:17 np0005544118 systemd-udevd[220547]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:51:17 np0005544118 systemd-machined[153602]: New machine qemu-24-instance-0000001d.
Dec  3 09:51:17 np0005544118 NetworkManager[55710]: <info>  [1764773477.7627] device (tape1749c32-12): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec  3 09:51:17 np0005544118 NetworkManager[55710]: <info>  [1764773477.7643] device (tape1749c32-12): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec  3 09:51:17 np0005544118 nova_compute[187283]: 2025-12-03 14:51:17.775 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:17 np0005544118 systemd[1]: Started Virtual Machine qemu-24-instance-0000001d.
Dec  3 09:51:17 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:17Z|00257|binding|INFO|Setting lport e1749c32-128f-4893-ac6b-8fc030d09c07 ovn-installed in OVS
Dec  3 09:51:17 np0005544118 nova_compute[187283]: 2025-12-03 14:51:17.781 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:19 np0005544118 nova_compute[187283]: 2025-12-03 14:51:19.304 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773479.3041022, 595be9f0-87f1-41d0-bdbd-539f6a8ec018 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:51:19 np0005544118 nova_compute[187283]: 2025-12-03 14:51:19.305 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] VM Started (Lifecycle Event)#033[00m
Dec  3 09:51:19 np0005544118 nova_compute[187283]: 2025-12-03 14:51:19.334 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:51:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:51:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:51:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:51:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:51:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:51:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:51:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:51:20 np0005544118 nova_compute[187283]: 2025-12-03 14:51:20.327 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:20 np0005544118 nova_compute[187283]: 2025-12-03 14:51:20.589 187287 DEBUG nova.virt.driver [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] Emitting event <LifecycleEvent: 1764773480.58949, 595be9f0-87f1-41d0-bdbd-539f6a8ec018 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:51:20 np0005544118 nova_compute[187283]: 2025-12-03 14:51:20.590 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] VM Resumed (Lifecycle Event)#033[00m
Dec  3 09:51:20 np0005544118 nova_compute[187283]: 2025-12-03 14:51:20.609 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:51:20 np0005544118 nova_compute[187283]: 2025-12-03 14:51:20.613 187287 DEBUG nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec  3 09:51:20 np0005544118 nova_compute[187283]: 2025-12-03 14:51:20.632 187287 INFO nova.compute.manager [None req-3a6bbd01-9132-4bf0-9166-257c7da06002 - - - - - -] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-1.ctlplane.example.com#033[00m
Dec  3 09:51:20 np0005544118 podman[220568]: 2025-12-03 14:51:20.856417706 +0000 UTC m=+0.076194727 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:51:21 np0005544118 nova_compute[187283]: 2025-12-03 14:51:21.140 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:22Z|00258|binding|INFO|Claiming lport e1749c32-128f-4893-ac6b-8fc030d09c07 for this chassis.
Dec  3 09:51:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:22Z|00259|binding|INFO|e1749c32-128f-4893-ac6b-8fc030d09c07: Claiming fa:16:3e:ca:06:81 10.100.0.14
Dec  3 09:51:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:22Z|00260|binding|INFO|Setting lport e1749c32-128f-4893-ac6b-8fc030d09c07 up in Southbound
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.419 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:06:81 10.100.0.14'], port_security=['fa:16:3e:ca:06:81 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '595be9f0-87f1-41d0-bdbd-539f6a8ec018', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d44f73f0a904ada8d2928eb93138c1b', 'neutron:revision_number': '11', 'neutron:security_group_ids': '672ab808-6000-42a9-9472-d2fcc9848347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e98f05f-4dc3-4cf3-9acb-4572ced6384f, chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=e1749c32-128f-4893-ac6b-8fc030d09c07) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.420 104491 INFO neutron.agent.ovn.metadata.agent [-] Port e1749c32-128f-4893-ac6b-8fc030d09c07 in datapath 3acc3251-99c7-4dd7-8875-b3546fb1889c bound to our chassis#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.421 104491 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3acc3251-99c7-4dd7-8875-b3546fb1889c#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.431 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[c01cdbc4-b0e3-4a90-b1f3-133824f1b3da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.432 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3acc3251-91 in ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.435 208813 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3acc3251-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.435 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[22518a54-641b-467f-8194-718d60f73636]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.436 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[573afed2-0210-4346-a41f-c805c34bd58d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.451 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[b1777afb-cc2b-488b-a942-10f31ae58036]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.465 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[35743e13-488d-411d-b92a-5551ce2d7367]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.483 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.491 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[d05e5b0f-6469-425d-9211-c924d452ebba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 NetworkManager[55710]: <info>  [1764773482.4989] manager: (tap3acc3251-90): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.498 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[81fcc7d2-34d5-451d-a65b-dd363a1c448b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 systemd-udevd[220599]: Network interface NamePolicy= disabled on kernel command line.
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.535 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[720e6b03-b2cc-4cce-aa7f-589fd0169601]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.538 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[9899aacb-931d-465f-81e3-07201101e164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 NetworkManager[55710]: <info>  [1764773482.5612] device (tap3acc3251-90): carrier: link connected
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.565 208827 DEBUG oslo.privsep.daemon [-] privsep: reply[a039b317-adfd-426a-be2d-501ca8e668b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.581 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[a7dd080d-7e29-439e-b37b-9e1f4a154afe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3acc3251-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:45:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563816, 'reachable_time': 22266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220618, 'error': None, 'target': 'ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.596 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[edeba68b-8f86-4374-bed5-5448c54d5274]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feda:4557'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 563816, 'tstamp': 563816}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220619, 'error': None, 'target': 'ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.611 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[4d4b3b7d-1b93-4f45-b647-87c512dbb34e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3acc3251-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:da:45:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563816, 'reachable_time': 22266, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220620, 'error': None, 'target': 'ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.642 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[582cc4ac-5093-476d-90c6-9f992f1a0329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 nova_compute[187283]: 2025-12-03 14:51:22.656 187287 INFO nova.compute.manager [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Post operation of migration started#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.699 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[f576fd22-df51-4f3f-ae07-20137904f95c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.700 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3acc3251-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.700 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.700 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3acc3251-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:22 np0005544118 NetworkManager[55710]: <info>  [1764773482.7030] manager: (tap3acc3251-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Dec  3 09:51:22 np0005544118 nova_compute[187283]: 2025-12-03 14:51:22.702 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:22 np0005544118 kernel: tap3acc3251-90: entered promiscuous mode
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.705 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3acc3251-90, col_values=(('external_ids', {'iface-id': 'd25d76a2-032e-447b-bed8-d34383d7549b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:22 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:22Z|00261|binding|INFO|Releasing lport d25d76a2-032e-447b-bed8-d34383d7549b from this chassis (sb_readonly=0)
Dec  3 09:51:22 np0005544118 nova_compute[187283]: 2025-12-03 14:51:22.717 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.718 104491 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3acc3251-99c7-4dd7-8875-b3546fb1889c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3acc3251-99c7-4dd7-8875-b3546fb1889c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.719 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[b09110eb-ddf5-4c4e-8252-548e7dea5011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.719 104491 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: global
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    log         /dev/log local0 debug
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    log-tag     haproxy-metadata-proxy-3acc3251-99c7-4dd7-8875-b3546fb1889c
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    user        root
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    group       root
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    maxconn     1024
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    pidfile     /var/lib/neutron/external/pids/3acc3251-99c7-4dd7-8875-b3546fb1889c.pid.haproxy
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    daemon
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: defaults
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    log global
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    mode http
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    option httplog
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    option dontlognull
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    option http-server-close
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    option forwardfor
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    retries                 3
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    timeout http-request    30s
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    timeout connect         30s
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    timeout client          32s
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    timeout server          32s
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    timeout http-keep-alive 30s
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: listen listener
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    bind 169.254.169.254:80
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    server metadata /var/lib/neutron/metadata_proxy
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]:    http-request add-header X-OVN-Network-ID 3acc3251-99c7-4dd7-8875-b3546fb1889c
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec  3 09:51:22 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:22.720 104491 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'env', 'PROCESS_TAG=haproxy-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3acc3251-99c7-4dd7-8875-b3546fb1889c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec  3 09:51:23 np0005544118 nova_compute[187283]: 2025-12-03 14:51:23.074 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "refresh_cache-595be9f0-87f1-41d0-bdbd-539f6a8ec018" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec  3 09:51:23 np0005544118 nova_compute[187283]: 2025-12-03 14:51:23.075 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquired lock "refresh_cache-595be9f0-87f1-41d0-bdbd-539f6a8ec018" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec  3 09:51:23 np0005544118 nova_compute[187283]: 2025-12-03 14:51:23.075 187287 DEBUG nova.network.neutron [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec  3 09:51:23 np0005544118 podman[220654]: 2025-12-03 14:51:23.079924326 +0000 UTC m=+0.050148584 container create edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  3 09:51:23 np0005544118 systemd[1]: Started libpod-conmon-edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06.scope.
Dec  3 09:51:23 np0005544118 podman[220654]: 2025-12-03 14:51:23.051776338 +0000 UTC m=+0.022000616 image pull 014dc726c85414b29f2dde7b5d875685d08784761c0f0ffa8630d1583a877bf9 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec  3 09:51:23 np0005544118 systemd[1]: Started libcrun container.
Dec  3 09:51:23 np0005544118 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2313f530d7939185c63cd515468468b8507ecfca618a733491516d790ce26fbe/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec  3 09:51:23 np0005544118 podman[220654]: 2025-12-03 14:51:23.168167951 +0000 UTC m=+0.138392229 container init edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c, tcib_managed=true, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:51:23 np0005544118 podman[220654]: 2025-12-03 14:51:23.174414596 +0000 UTC m=+0.144638854 container start edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c, io.buildah.version=1.41.3, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec  3 09:51:23 np0005544118 neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c[220670]: [NOTICE]   (220674) : New worker (220676) forked
Dec  3 09:51:23 np0005544118 neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c[220670]: [NOTICE]   (220674) : Loading success.
Dec  3 09:51:24 np0005544118 nova_compute[187283]: 2025-12-03 14:51:24.173 187287 DEBUG nova.network.neutron [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Updating instance_info_cache with network_info: [{"id": "e1749c32-128f-4893-ac6b-8fc030d09c07", "address": "fa:16:3e:ca:06:81", "network": {"id": "3acc3251-99c7-4dd7-8875-b3546fb1889c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-229839387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d44f73f0a904ada8d2928eb93138c1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1749c32-12", "ovs_interfaceid": "e1749c32-128f-4893-ac6b-8fc030d09c07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:51:24 np0005544118 nova_compute[187283]: 2025-12-03 14:51:24.689 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Releasing lock "refresh_cache-595be9f0-87f1-41d0-bdbd-539f6a8ec018" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec  3 09:51:24 np0005544118 nova_compute[187283]: 2025-12-03 14:51:24.763 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:24 np0005544118 nova_compute[187283]: 2025-12-03 14:51:24.764 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:24 np0005544118 nova_compute[187283]: 2025-12-03 14:51:24.764 187287 DEBUG oslo_concurrency.lockutils [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:24 np0005544118 nova_compute[187283]: 2025-12-03 14:51:24.768 187287 INFO nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec  3 09:51:24 np0005544118 virtqemud[186958]: Domain id=24 name='instance-0000001d' uuid=595be9f0-87f1-41d0-bdbd-539f6a8ec018 is tainted: custom-monitor
Dec  3 09:51:25 np0005544118 nova_compute[187283]: 2025-12-03 14:51:25.329 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:25 np0005544118 nova_compute[187283]: 2025-12-03 14:51:25.775 187287 INFO nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec  3 09:51:26 np0005544118 nova_compute[187283]: 2025-12-03 14:51:26.141 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:26 np0005544118 nova_compute[187283]: 2025-12-03 14:51:26.780 187287 INFO nova.virt.libvirt.driver [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec  3 09:51:26 np0005544118 nova_compute[187283]: 2025-12-03 14:51:26.784 187287 DEBUG nova.compute.manager [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:51:26 np0005544118 podman[220685]: 2025-12-03 14:51:26.847095278 +0000 UTC m=+0.081952129 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:51:26 np0005544118 nova_compute[187283]: 2025-12-03 14:51:26.849 187287 DEBUG nova.objects.instance [None req-caacf482-41a1-4313-8464-1c37c6c0926c b0fed23c11ca46d79fd5f194bd3c25cc 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec  3 09:51:29 np0005544118 nova_compute[187283]: 2025-12-03 14:51:29.638 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:30 np0005544118 nova_compute[187283]: 2025-12-03 14:51:30.333 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:30 np0005544118 nova_compute[187283]: 2025-12-03 14:51:30.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.151 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.993 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Acquiring lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.994 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.994 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Acquiring lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.994 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.995 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.996 187287 INFO nova.compute.manager [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Terminating instance#033[00m
Dec  3 09:51:31 np0005544118 nova_compute[187283]: 2025-12-03 14:51:31.997 187287 DEBUG nova.compute.manager [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec  3 09:51:32 np0005544118 kernel: tape1749c32-12 (unregistering): left promiscuous mode
Dec  3 09:51:32 np0005544118 NetworkManager[55710]: <info>  [1764773492.0246] device (tape1749c32-12): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.031 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:32Z|00262|binding|INFO|Releasing lport e1749c32-128f-4893-ac6b-8fc030d09c07 from this chassis (sb_readonly=0)
Dec  3 09:51:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:32Z|00263|binding|INFO|Setting lport e1749c32-128f-4893-ac6b-8fc030d09c07 down in Southbound
Dec  3 09:51:32 np0005544118 ovn_controller[95637]: 2025-12-03T14:51:32Z|00264|binding|INFO|Removing iface tape1749c32-12 ovn-installed in OVS
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.033 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.040 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ca:06:81 10.100.0.14'], port_security=['fa:16:3e:ca:06:81 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '595be9f0-87f1-41d0-bdbd-539f6a8ec018', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3d44f73f0a904ada8d2928eb93138c1b', 'neutron:revision_number': '13', 'neutron:security_group_ids': '672ab808-6000-42a9-9472-d2fcc9848347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e98f05f-4dc3-4cf3-9acb-4572ced6384f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>], logical_port=e1749c32-128f-4893-ac6b-8fc030d09c07) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fe6f4b90b50>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.042 104491 INFO neutron.agent.ovn.metadata.agent [-] Port e1749c32-128f-4893-ac6b-8fc030d09c07 in datapath 3acc3251-99c7-4dd7-8875-b3546fb1889c unbound from our chassis#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.043 104491 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3acc3251-99c7-4dd7-8875-b3546fb1889c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.044 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[48fb7abc-7ede-45d9-9e39-ff6572940622]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.044 104491 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c namespace which is not needed anymore#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.049 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Dec  3 09:51:32 np0005544118 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001d.scope: Consumed 2.522s CPU time.
Dec  3 09:51:32 np0005544118 systemd-machined[153602]: Machine qemu-24-instance-0000001d terminated.
Dec  3 09:51:32 np0005544118 neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c[220670]: [NOTICE]   (220674) : haproxy version is 2.8.14-c23fe91
Dec  3 09:51:32 np0005544118 neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c[220670]: [NOTICE]   (220674) : path to executable is /usr/sbin/haproxy
Dec  3 09:51:32 np0005544118 neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c[220670]: [WARNING]  (220674) : Exiting Master process...
Dec  3 09:51:32 np0005544118 neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c[220670]: [ALERT]    (220674) : Current worker (220676) exited with code 143 (Terminated)
Dec  3 09:51:32 np0005544118 neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c[220670]: [WARNING]  (220674) : All workers exited. Exiting... (0)
Dec  3 09:51:32 np0005544118 systemd[1]: libpod-edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06.scope: Deactivated successfully.
Dec  3 09:51:32 np0005544118 podman[220736]: 2025-12-03 14:51:32.17583372 +0000 UTC m=+0.042388717 container died edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Dec  3 09:51:32 np0005544118 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06-userdata-shm.mount: Deactivated successfully.
Dec  3 09:51:32 np0005544118 systemd[1]: var-lib-containers-storage-overlay-2313f530d7939185c63cd515468468b8507ecfca618a733491516d790ce26fbe-merged.mount: Deactivated successfully.
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.216 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.219 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 podman[220736]: 2025-12-03 14:51:32.220330353 +0000 UTC m=+0.086885350 container cleanup edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:51:32 np0005544118 systemd[1]: libpod-conmon-edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06.scope: Deactivated successfully.
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.253 187287 INFO nova.virt.libvirt.driver [-] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Instance destroyed successfully.#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.254 187287 DEBUG nova.objects.instance [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Lazy-loading 'resources' on Instance uuid 595be9f0-87f1-41d0-bdbd-539f6a8ec018 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec  3 09:51:32 np0005544118 podman[220776]: 2025-12-03 14:51:32.286976403 +0000 UTC m=+0.041737829 container remove edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.291 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf5cd9e-30a2-496a-93e2-91903bb66c04]: (4, ('Wed Dec  3 02:51:32 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c (edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06)\nedf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06\nWed Dec  3 02:51:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c (edf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06)\nedf340e3b24f201a362a864a3dc5e06f4c35ea92c0009a7495b0752bfce63d06\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.293 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[38082230-0c05-4b1d-ab2f-dbbe7f39f2d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.294 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3acc3251-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.296 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 kernel: tap3acc3251-90: left promiscuous mode
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.310 187287 DEBUG nova.virt.libvirt.vif [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-03T14:50:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-1737275869',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-1737275869',id=29,image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-03T14:50:59Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3d44f73f0a904ada8d2928eb93138c1b',ramdisk_id='',reservation_id='r-w70vd0pt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='c4df1e47-ea6c-486a-a6b4-60f325b44502',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-8520329',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-8520329-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-03T14:51:26Z,user_data=None,user_id='eac4d94280c04dff8a59ae8e5c542f6a',uuid=595be9f0-87f1-41d0-bdbd-539f6a8ec018,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1749c32-128f-4893-ac6b-8fc030d09c07", "address": "fa:16:3e:ca:06:81", "network": {"id": "3acc3251-99c7-4dd7-8875-b3546fb1889c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-229839387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d44f73f0a904ada8d2928eb93138c1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1749c32-12", "ovs_interfaceid": "e1749c32-128f-4893-ac6b-8fc030d09c07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.310 187287 DEBUG nova.network.os_vif_util [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Converting VIF {"id": "e1749c32-128f-4893-ac6b-8fc030d09c07", "address": "fa:16:3e:ca:06:81", "network": {"id": "3acc3251-99c7-4dd7-8875-b3546fb1889c", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-229839387-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3d44f73f0a904ada8d2928eb93138c1b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1749c32-12", "ovs_interfaceid": "e1749c32-128f-4893-ac6b-8fc030d09c07", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.311 187287 DEBUG nova.network.os_vif_util [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ca:06:81,bridge_name='br-int',has_traffic_filtering=True,id=e1749c32-128f-4893-ac6b-8fc030d09c07,network=Network(3acc3251-99c7-4dd7-8875-b3546fb1889c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1749c32-12') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.311 187287 DEBUG os_vif [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:06:81,bridge_name='br-int',has_traffic_filtering=True,id=e1749c32-128f-4893-ac6b-8fc030d09c07,network=Network(3acc3251-99c7-4dd7-8875-b3546fb1889c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1749c32-12') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.312 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.313 187287 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1749c32-12, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.313 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[ba4b4dfa-a10f-4af2-a684-e2dca4e27ba9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.314 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.314 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.353 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.356 187287 INFO os_vif [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ca:06:81,bridge_name='br-int',has_traffic_filtering=True,id=e1749c32-128f-4893-ac6b-8fc030d09c07,network=Network(3acc3251-99c7-4dd7-8875-b3546fb1889c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1749c32-12')#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.357 187287 INFO nova.virt.libvirt.driver [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Deleting instance files /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018_del#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.357 187287 INFO nova.virt.libvirt.driver [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Deletion of /var/lib/nova/instances/595be9f0-87f1-41d0-bdbd-539f6a8ec018_del complete#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.366 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[e81353a0-2dfb-4b7e-8751-ab2c238028da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.368 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[5875c69e-200c-4fd2-8a04-ad732a1feead]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.386 208813 DEBUG oslo.privsep.daemon [-] privsep: reply[8c90d42a-8810-4c8d-a2b6-8f4384850483]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 563809, 'reachable_time': 31356, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220798, 'error': None, 'target': 'ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.389 104605 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3acc3251-99c7-4dd7-8875-b3546fb1889c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec  3 09:51:32 np0005544118 systemd[1]: run-netns-ovnmeta\x2d3acc3251\x2d99c7\x2d4dd7\x2d8875\x2db3546fb1889c.mount: Deactivated successfully.
Dec  3 09:51:32 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:51:32.390 104605 DEBUG oslo.privsep.daemon [-] privsep: reply[24e8e921-5395-4e51-9041-7675303fea59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.425 187287 INFO nova.compute.manager [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Took 0.43 seconds to destroy the instance on the hypervisor.#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.426 187287 DEBUG oslo.service.loopingcall [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.427 187287 DEBUG nova.compute.manager [-] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.428 187287 DEBUG nova.network.neutron [-] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec  3 09:51:32 np0005544118 nova_compute[187283]: 2025-12-03 14:51:32.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:34 np0005544118 nova_compute[187283]: 2025-12-03 14:51:34.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:34 np0005544118 nova_compute[187283]: 2025-12-03 14:51:34.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 09:51:35 np0005544118 nova_compute[187283]: 2025-12-03 14:51:35.133 187287 DEBUG nova.compute.manager [req-1556a285-9fc8-4557-8eee-d94371afecf2 req-07238c9a-8e2d-4223-92a4-623dd4c8c5ef c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Received event network-vif-unplugged-e1749c32-128f-4893-ac6b-8fc030d09c07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:51:35 np0005544118 nova_compute[187283]: 2025-12-03 14:51:35.134 187287 DEBUG oslo_concurrency.lockutils [req-1556a285-9fc8-4557-8eee-d94371afecf2 req-07238c9a-8e2d-4223-92a4-623dd4c8c5ef c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:35 np0005544118 nova_compute[187283]: 2025-12-03 14:51:35.134 187287 DEBUG oslo_concurrency.lockutils [req-1556a285-9fc8-4557-8eee-d94371afecf2 req-07238c9a-8e2d-4223-92a4-623dd4c8c5ef c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:35 np0005544118 nova_compute[187283]: 2025-12-03 14:51:35.134 187287 DEBUG oslo_concurrency.lockutils [req-1556a285-9fc8-4557-8eee-d94371afecf2 req-07238c9a-8e2d-4223-92a4-623dd4c8c5ef c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:35 np0005544118 nova_compute[187283]: 2025-12-03 14:51:35.134 187287 DEBUG nova.compute.manager [req-1556a285-9fc8-4557-8eee-d94371afecf2 req-07238c9a-8e2d-4223-92a4-623dd4c8c5ef c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] No waiting events found dispatching network-vif-unplugged-e1749c32-128f-4893-ac6b-8fc030d09c07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:51:35 np0005544118 nova_compute[187283]: 2025-12-03 14:51:35.135 187287 DEBUG nova.compute.manager [req-1556a285-9fc8-4557-8eee-d94371afecf2 req-07238c9a-8e2d-4223-92a4-623dd4c8c5ef c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Received event network-vif-unplugged-e1749c32-128f-4893-ac6b-8fc030d09c07 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec  3 09:51:35 np0005544118 podman[197639]: time="2025-12-03T14:51:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:51:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:51:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:51:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:51:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2598 "" "Go-http-client/1.1"
Dec  3 09:51:35 np0005544118 nova_compute[187283]: 2025-12-03 14:51:35.668 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:35 np0005544118 podman[220801]: 2025-12-03 14:51:35.828385977 +0000 UTC m=+0.054485619 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41)
Dec  3 09:51:36 np0005544118 nova_compute[187283]: 2025-12-03 14:51:36.156 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:36 np0005544118 nova_compute[187283]: 2025-12-03 14:51:36.932 187287 DEBUG nova.network.neutron [-] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec  3 09:51:36 np0005544118 nova_compute[187283]: 2025-12-03 14:51:36.948 187287 INFO nova.compute.manager [-] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Took 4.52 seconds to deallocate network for instance.#033[00m
Dec  3 09:51:36 np0005544118 nova_compute[187283]: 2025-12-03 14:51:36.983 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:36 np0005544118 nova_compute[187283]: 2025-12-03 14:51:36.984 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:36 np0005544118 nova_compute[187283]: 2025-12-03 14:51:36.989 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.016 187287 INFO nova.scheduler.client.report [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Deleted allocations for instance 595be9f0-87f1-41d0-bdbd-539f6a8ec018#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.105 187287 DEBUG oslo_concurrency.lockutils [None req-a6cc0d98-60a5-4d4c-86ca-898735357565 eac4d94280c04dff8a59ae8e5c542f6a 3d44f73f0a904ada8d2928eb93138c1b - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.233 187287 DEBUG nova.compute.manager [req-7927d239-dc2d-473a-98be-1cb500414697 req-bb798abf-257e-482c-aff7-4e8277ead0fc c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Received event network-vif-plugged-e1749c32-128f-4893-ac6b-8fc030d09c07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.234 187287 DEBUG oslo_concurrency.lockutils [req-7927d239-dc2d-473a-98be-1cb500414697 req-bb798abf-257e-482c-aff7-4e8277ead0fc c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Acquiring lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.234 187287 DEBUG oslo_concurrency.lockutils [req-7927d239-dc2d-473a-98be-1cb500414697 req-bb798abf-257e-482c-aff7-4e8277ead0fc c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.234 187287 DEBUG oslo_concurrency.lockutils [req-7927d239-dc2d-473a-98be-1cb500414697 req-bb798abf-257e-482c-aff7-4e8277ead0fc c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] Lock "595be9f0-87f1-41d0-bdbd-539f6a8ec018-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.234 187287 DEBUG nova.compute.manager [req-7927d239-dc2d-473a-98be-1cb500414697 req-bb798abf-257e-482c-aff7-4e8277ead0fc c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] No waiting events found dispatching network-vif-plugged-e1749c32-128f-4893-ac6b-8fc030d09c07 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.235 187287 WARNING nova.compute.manager [req-7927d239-dc2d-473a-98be-1cb500414697 req-bb798abf-257e-482c-aff7-4e8277ead0fc c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Received unexpected event network-vif-plugged-e1749c32-128f-4893-ac6b-8fc030d09c07 for instance with vm_state deleted and task_state None.#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.235 187287 DEBUG nova.compute.manager [req-7927d239-dc2d-473a-98be-1cb500414697 req-bb798abf-257e-482c-aff7-4e8277ead0fc c1793c36b4194567b9a8952b68c5ac77 062c7be2a68145929adbf86b30df976a - - default default] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Received event network-vif-deleted-e1749c32-128f-4893-ac6b-8fc030d09c07 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec  3 09:51:37 np0005544118 nova_compute[187283]: 2025-12-03 14:51:37.314 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:38 np0005544118 nova_compute[187283]: 2025-12-03 14:51:38.612 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.636 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:51:39 np0005544118 podman[220824]: 2025-12-03 14:51:39.738756696 +0000 UTC m=+0.067320981 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.815 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.817 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5843MB free_disk=73.33415985107422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.817 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.817 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.989 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:51:39 np0005544118 nova_compute[187283]: 2025-12-03 14:51:39.990 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:51:40 np0005544118 nova_compute[187283]: 2025-12-03 14:51:40.028 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:51:40 np0005544118 nova_compute[187283]: 2025-12-03 14:51:40.043 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:51:40 np0005544118 nova_compute[187283]: 2025-12-03 14:51:40.044 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:51:40 np0005544118 nova_compute[187283]: 2025-12-03 14:51:40.044 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.227s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:51:41 np0005544118 nova_compute[187283]: 2025-12-03 14:51:41.158 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:42 np0005544118 nova_compute[187283]: 2025-12-03 14:51:42.039 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:42 np0005544118 nova_compute[187283]: 2025-12-03 14:51:42.315 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:42 np0005544118 nova_compute[187283]: 2025-12-03 14:51:42.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:42 np0005544118 nova_compute[187283]: 2025-12-03 14:51:42.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:51:42 np0005544118 nova_compute[187283]: 2025-12-03 14:51:42.609 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:51:42 np0005544118 nova_compute[187283]: 2025-12-03 14:51:42.628 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:51:43 np0005544118 nova_compute[187283]: 2025-12-03 14:51:43.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:43 np0005544118 nova_compute[187283]: 2025-12-03 14:51:43.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 09:51:43 np0005544118 nova_compute[187283]: 2025-12-03 14:51:43.627 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 09:51:44 np0005544118 nova_compute[187283]: 2025-12-03 14:51:44.624 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:45 np0005544118 nova_compute[187283]: 2025-12-03 14:51:45.751 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:46 np0005544118 nova_compute[187283]: 2025-12-03 14:51:46.171 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:46 np0005544118 podman[220845]: 2025-12-03 14:51:46.819239892 +0000 UTC m=+0.051226203 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec  3 09:51:47 np0005544118 nova_compute[187283]: 2025-12-03 14:51:47.252 187287 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764773492.251346, 595be9f0-87f1-41d0-bdbd-539f6a8ec018 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec  3 09:51:47 np0005544118 nova_compute[187283]: 2025-12-03 14:51:47.252 187287 INFO nova.compute.manager [-] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] VM Stopped (Lifecycle Event)#033[00m
Dec  3 09:51:47 np0005544118 nova_compute[187283]: 2025-12-03 14:51:47.274 187287 DEBUG nova.compute.manager [None req-634ea3e2-a73a-4acc-ac4a-531b6d2a4cc2 - - - - - -] [instance: 595be9f0-87f1-41d0-bdbd-539f6a8ec018] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec  3 09:51:47 np0005544118 nova_compute[187283]: 2025-12-03 14:51:47.337 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:47 np0005544118 nova_compute[187283]: 2025-12-03 14:51:47.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:47 np0005544118 nova_compute[187283]: 2025-12-03 14:51:47.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:51:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:51:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:51:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:51:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:51:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:51:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:51:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:51:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:51:51 np0005544118 nova_compute[187283]: 2025-12-03 14:51:51.175 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:51 np0005544118 nova_compute[187283]: 2025-12-03 14:51:51.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:51:51 np0005544118 podman[220865]: 2025-12-03 14:51:51.827441436 +0000 UTC m=+0.054551181 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:51:52 np0005544118 nova_compute[187283]: 2025-12-03 14:51:52.340 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:56 np0005544118 nova_compute[187283]: 2025-12-03 14:51:56.176 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:57 np0005544118 nova_compute[187283]: 2025-12-03 14:51:57.343 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:51:57 np0005544118 podman[220890]: 2025-12-03 14:51:57.84675983 +0000 UTC m=+0.079517665 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:52:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:52:00.991 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:52:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:52:00.992 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:52:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:52:00.992 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:52:01 np0005544118 nova_compute[187283]: 2025-12-03 14:52:01.225 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:02 np0005544118 nova_compute[187283]: 2025-12-03 14:52:02.374 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:05 np0005544118 podman[197639]: time="2025-12-03T14:52:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:52:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:52:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:52:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:52:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 09:52:06 np0005544118 nova_compute[187283]: 2025-12-03 14:52:06.228 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:06 np0005544118 podman[220916]: 2025-12-03 14:52:06.826759244 +0000 UTC m=+0.060880488 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.6)
Dec  3 09:52:06 np0005544118 ovn_controller[95637]: 2025-12-03T14:52:06Z|00265|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec  3 09:52:07 np0005544118 nova_compute[187283]: 2025-12-03 14:52:07.376 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:10 np0005544118 podman[220937]: 2025-12-03 14:52:10.827406492 +0000 UTC m=+0.057972882 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  3 09:52:11 np0005544118 nova_compute[187283]: 2025-12-03 14:52:11.231 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:12 np0005544118 nova_compute[187283]: 2025-12-03 14:52:12.415 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:52:13.075 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:52:13 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:52:13.075 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:52:13 np0005544118 nova_compute[187283]: 2025-12-03 14:52:13.076 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:16 np0005544118 nova_compute[187283]: 2025-12-03 14:52:16.233 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:17 np0005544118 nova_compute[187283]: 2025-12-03 14:52:17.417 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:17 np0005544118 nova_compute[187283]: 2025-12-03 14:52:17.424 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:17 np0005544118 podman[220958]: 2025-12-03 14:52:17.817841584 +0000 UTC m=+0.054799948 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_metadata_agent)
Dec  3 09:52:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:52:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:52:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:52:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:52:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:52:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:52:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:52:21 np0005544118 nova_compute[187283]: 2025-12-03 14:52:21.236 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:22 np0005544118 nova_compute[187283]: 2025-12-03 14:52:22.420 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:22 np0005544118 podman[220980]: 2025-12-03 14:52:22.83066938 +0000 UTC m=+0.057463038 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:52:23 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:52:23.077 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:52:26 np0005544118 nova_compute[187283]: 2025-12-03 14:52:26.237 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:27 np0005544118 nova_compute[187283]: 2025-12-03 14:52:27.421 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:28 np0005544118 podman[221009]: 2025-12-03 14:52:28.840137463 +0000 UTC m=+0.074957772 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec  3 09:52:29 np0005544118 nova_compute[187283]: 2025-12-03 14:52:29.622 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:31 np0005544118 nova_compute[187283]: 2025-12-03 14:52:31.265 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:32 np0005544118 nova_compute[187283]: 2025-12-03 14:52:32.455 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:32 np0005544118 nova_compute[187283]: 2025-12-03 14:52:32.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:32 np0005544118 nova_compute[187283]: 2025-12-03 14:52:32.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:35 np0005544118 podman[197639]: time="2025-12-03T14:52:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:52:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:52:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:52:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:52:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 09:52:36 np0005544118 nova_compute[187283]: 2025-12-03 14:52:36.267 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:36 np0005544118 nova_compute[187283]: 2025-12-03 14:52:36.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:37 np0005544118 nova_compute[187283]: 2025-12-03 14:52:37.456 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:37 np0005544118 podman[221038]: 2025-12-03 14:52:37.828574803 +0000 UTC m=+0.055212598 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Dec  3 09:52:38 np0005544118 nova_compute[187283]: 2025-12-03 14:52:38.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.270 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:41 np0005544118 podman[221059]: 2025-12-03 14:52:41.822563255 +0000 UTC m=+0.055912718 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.833 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.833 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.833 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.833 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.975 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.977 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5865MB free_disk=73.33415985107422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.977 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:52:41 np0005544118 nova_compute[187283]: 2025-12-03 14:52:41.977 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:52:42 np0005544118 nova_compute[187283]: 2025-12-03 14:52:42.100 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:52:42 np0005544118 nova_compute[187283]: 2025-12-03 14:52:42.100 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:52:42 np0005544118 nova_compute[187283]: 2025-12-03 14:52:42.165 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:52:42 np0005544118 nova_compute[187283]: 2025-12-03 14:52:42.366 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:52:42 np0005544118 nova_compute[187283]: 2025-12-03 14:52:42.367 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:52:42 np0005544118 nova_compute[187283]: 2025-12-03 14:52:42.368 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:52:42 np0005544118 nova_compute[187283]: 2025-12-03 14:52:42.460 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:43 np0005544118 nova_compute[187283]: 2025-12-03 14:52:43.369 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:43 np0005544118 nova_compute[187283]: 2025-12-03 14:52:43.370 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:52:43 np0005544118 nova_compute[187283]: 2025-12-03 14:52:43.370 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:52:43 np0005544118 nova_compute[187283]: 2025-12-03 14:52:43.386 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:52:46 np0005544118 nova_compute[187283]: 2025-12-03 14:52:46.272 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:47 np0005544118 nova_compute[187283]: 2025-12-03 14:52:47.463 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:48 np0005544118 podman[221079]: 2025-12-03 14:52:48.809191486 +0000 UTC m=+0.045386208 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec  3 09:52:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:52:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:52:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:52:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:52:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:52:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:52:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:52:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:52:49 np0005544118 nova_compute[187283]: 2025-12-03 14:52:49.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:52:49 np0005544118 nova_compute[187283]: 2025-12-03 14:52:49.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:52:51 np0005544118 nova_compute[187283]: 2025-12-03 14:52:51.274 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:52 np0005544118 nova_compute[187283]: 2025-12-03 14:52:52.466 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:52 np0005544118 ovn_controller[95637]: 2025-12-03T14:52:52Z|00266|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec  3 09:52:53 np0005544118 podman[221098]: 2025-12-03 14:52:53.848076264 +0000 UTC m=+0.067346780 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:52:56 np0005544118 nova_compute[187283]: 2025-12-03 14:52:56.275 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:57 np0005544118 nova_compute[187283]: 2025-12-03 14:52:57.506 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:52:59 np0005544118 podman[221125]: 2025-12-03 14:52:59.865320504 +0000 UTC m=+0.096259100 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:53:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:53:00.993 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:53:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:53:00.994 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:53:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:53:00.994 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:53:01 np0005544118 nova_compute[187283]: 2025-12-03 14:53:01.277 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:02 np0005544118 nova_compute[187283]: 2025-12-03 14:53:02.509 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:05 np0005544118 podman[197639]: time="2025-12-03T14:53:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:53:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:53:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:53:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:53:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec  3 09:53:06 np0005544118 nova_compute[187283]: 2025-12-03 14:53:06.281 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:07 np0005544118 nova_compute[187283]: 2025-12-03 14:53:07.511 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:08 np0005544118 podman[221152]: 2025-12-03 14:53:08.843383768 +0000 UTC m=+0.062314656 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec  3 09:53:11 np0005544118 nova_compute[187283]: 2025-12-03 14:53:11.324 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:12 np0005544118 systemd-logind[795]: New session 36 of user zuul.
Dec  3 09:53:12 np0005544118 systemd[1]: Started Session 36 of User zuul.
Dec  3 09:53:12 np0005544118 podman[221175]: 2025-12-03 14:53:12.240595501 +0000 UTC m=+0.057656074 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 09:53:12 np0005544118 nova_compute[187283]: 2025-12-03 14:53:12.513 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:16 np0005544118 nova_compute[187283]: 2025-12-03 14:53:16.326 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:17 np0005544118 nova_compute[187283]: 2025-12-03 14:53:17.515 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:18 np0005544118 ovs-vsctl[221412]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  3 09:53:18 np0005544118 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 221222 (sos)
Dec  3 09:53:18 np0005544118 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec  3 09:53:18 np0005544118 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec  3 09:53:18 np0005544118 podman[221460]: 2025-12-03 14:53:18.954972126 +0000 UTC m=+0.094540443 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:53:19 np0005544118 virtqemud[186958]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  3 09:53:19 np0005544118 virtqemud[186958]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  3 09:53:19 np0005544118 virtqemud[186958]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  3 09:53:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:53:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:53:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:53:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:53:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:53:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:53:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:53:20 np0005544118 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec  3 09:53:21 np0005544118 nova_compute[187283]: 2025-12-03 14:53:21.328 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:22 np0005544118 systemd[1]: Starting Hostname Service...
Dec  3 09:53:22 np0005544118 systemd[1]: Started Hostname Service.
Dec  3 09:53:22 np0005544118 nova_compute[187283]: 2025-12-03 14:53:22.517 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:24 np0005544118 podman[222195]: 2025-12-03 14:53:24.407424685 +0000 UTC m=+0.055454234 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:53:26 np0005544118 nova_compute[187283]: 2025-12-03 14:53:26.329 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:27 np0005544118 nova_compute[187283]: 2025-12-03 14:53:27.519 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:28 np0005544118 ovs-appctl[223133]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  3 09:53:28 np0005544118 ovs-appctl[223138]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  3 09:53:28 np0005544118 ovs-appctl[223149]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec  3 09:53:30 np0005544118 podman[223887]: 2025-12-03 14:53:30.866019354 +0000 UTC m=+0.086016826 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Dec  3 09:53:31 np0005544118 nova_compute[187283]: 2025-12-03 14:53:31.331 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:31 np0005544118 nova_compute[187283]: 2025-12-03 14:53:31.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:32 np0005544118 nova_compute[187283]: 2025-12-03 14:53:32.521 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:33 np0005544118 nova_compute[187283]: 2025-12-03 14:53:33.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:34 np0005544118 nova_compute[187283]: 2025-12-03 14:53:34.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:35 np0005544118 podman[197639]: time="2025-12-03T14:53:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:53:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:53:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:53:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:53:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2601 "" "Go-http-client/1.1"
Dec  3 09:53:35 np0005544118 virtqemud[186958]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  3 09:53:36 np0005544118 nova_compute[187283]: 2025-12-03 14:53:36.332 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:36 np0005544118 nova_compute[187283]: 2025-12-03 14:53:36.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:37 np0005544118 nova_compute[187283]: 2025-12-03 14:53:37.523 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:37 np0005544118 systemd[1]: Starting Time & Date Service...
Dec  3 09:53:38 np0005544118 systemd[1]: Started Time & Date Service.
Dec  3 09:53:39 np0005544118 nova_compute[187283]: 2025-12-03 14:53:39.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:39 np0005544118 podman[224559]: 2025-12-03 14:53:39.840950514 +0000 UTC m=+0.070043232 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.336 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.639 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.640 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.640 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.640 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.886 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.888 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5497MB free_disk=72.9000015258789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.889 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.889 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.960 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:53:41 np0005544118 nova_compute[187283]: 2025-12-03 14:53:41.961 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:53:42 np0005544118 nova_compute[187283]: 2025-12-03 14:53:42.051 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:53:42 np0005544118 nova_compute[187283]: 2025-12-03 14:53:42.072 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:53:42 np0005544118 nova_compute[187283]: 2025-12-03 14:53:42.105 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:53:42 np0005544118 nova_compute[187283]: 2025-12-03 14:53:42.106 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.217s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:53:42 np0005544118 nova_compute[187283]: 2025-12-03 14:53:42.526 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:42 np0005544118 podman[224587]: 2025-12-03 14:53:42.558501944 +0000 UTC m=+0.095134629 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  3 09:53:43 np0005544118 nova_compute[187283]: 2025-12-03 14:53:43.102 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:43 np0005544118 nova_compute[187283]: 2025-12-03 14:53:43.611 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:43 np0005544118 nova_compute[187283]: 2025-12-03 14:53:43.611 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:53:43 np0005544118 nova_compute[187283]: 2025-12-03 14:53:43.612 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:53:43 np0005544118 nova_compute[187283]: 2025-12-03 14:53:43.632 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:53:46 np0005544118 nova_compute[187283]: 2025-12-03 14:53:46.339 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:47 np0005544118 nova_compute[187283]: 2025-12-03 14:53:47.556 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:53:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:53:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:53:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:53:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:53:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:53:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:53:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:53:49 np0005544118 nova_compute[187283]: 2025-12-03 14:53:49.609 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:49 np0005544118 nova_compute[187283]: 2025-12-03 14:53:49.642 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:53:49 np0005544118 nova_compute[187283]: 2025-12-03 14:53:49.642 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:53:49 np0005544118 podman[224612]: 2025-12-03 14:53:49.817890734 +0000 UTC m=+0.049139187 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  3 09:53:51 np0005544118 nova_compute[187283]: 2025-12-03 14:53:51.339 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:52 np0005544118 nova_compute[187283]: 2025-12-03 14:53:52.594 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:54 np0005544118 podman[224631]: 2025-12-03 14:53:54.859299959 +0000 UTC m=+0.092970421 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:53:56 np0005544118 nova_compute[187283]: 2025-12-03 14:53:56.343 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:57 np0005544118 systemd-logind[795]: Session 36 logged out. Waiting for processes to exit.
Dec  3 09:53:57 np0005544118 systemd[1]: session-36.scope: Deactivated successfully.
Dec  3 09:53:57 np0005544118 systemd[1]: session-36.scope: Consumed 1min 13.466s CPU time, 499.9M memory peak, read 102.8M from disk, written 15.9M to disk.
Dec  3 09:53:57 np0005544118 systemd-logind[795]: Removed session 36.
Dec  3 09:53:57 np0005544118 systemd-logind[795]: New session 37 of user zuul.
Dec  3 09:53:57 np0005544118 systemd[1]: Started Session 37 of User zuul.
Dec  3 09:53:57 np0005544118 nova_compute[187283]: 2025-12-03 14:53:57.595 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:53:57 np0005544118 systemd[1]: session-37.scope: Deactivated successfully.
Dec  3 09:53:57 np0005544118 systemd-logind[795]: Session 37 logged out. Waiting for processes to exit.
Dec  3 09:53:57 np0005544118 systemd-logind[795]: Removed session 37.
Dec  3 09:53:57 np0005544118 systemd-logind[795]: New session 38 of user zuul.
Dec  3 09:53:57 np0005544118 systemd[1]: Started Session 38 of User zuul.
Dec  3 09:53:57 np0005544118 systemd[1]: session-38.scope: Deactivated successfully.
Dec  3 09:53:57 np0005544118 systemd-logind[795]: Session 38 logged out. Waiting for processes to exit.
Dec  3 09:53:57 np0005544118 systemd-logind[795]: Removed session 38.
Dec  3 09:54:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:54:00.994 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:54:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:54:00.995 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:54:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:54:00.995 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:54:01 np0005544118 nova_compute[187283]: 2025-12-03 14:54:01.344 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:01 np0005544118 podman[224714]: 2025-12-03 14:54:01.886454417 +0000 UTC m=+0.117612346 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec  3 09:54:02 np0005544118 nova_compute[187283]: 2025-12-03 14:54:02.598 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:05 np0005544118 podman[197639]: time="2025-12-03T14:54:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:54:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:54:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:54:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:54:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec  3 09:54:06 np0005544118 nova_compute[187283]: 2025-12-03 14:54:06.346 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:07 np0005544118 nova_compute[187283]: 2025-12-03 14:54:07.601 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:08 np0005544118 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec  3 09:54:08 np0005544118 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec  3 09:54:10 np0005544118 podman[224746]: 2025-12-03 14:54:10.846979745 +0000 UTC m=+0.079988667 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, release=1755695350, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  3 09:54:11 np0005544118 nova_compute[187283]: 2025-12-03 14:54:11.348 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:12 np0005544118 nova_compute[187283]: 2025-12-03 14:54:12.604 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:12 np0005544118 podman[224767]: 2025-12-03 14:54:12.820286795 +0000 UTC m=+0.056584474 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec  3 09:54:16 np0005544118 nova_compute[187283]: 2025-12-03 14:54:16.350 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:17 np0005544118 nova_compute[187283]: 2025-12-03 14:54:17.605 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:54:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:54:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:54:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:54:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:54:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:54:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:54:20 np0005544118 podman[224788]: 2025-12-03 14:54:20.849204516 +0000 UTC m=+0.072507279 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:54:21 np0005544118 nova_compute[187283]: 2025-12-03 14:54:21.352 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:22 np0005544118 nova_compute[187283]: 2025-12-03 14:54:22.608 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:25 np0005544118 podman[224808]: 2025-12-03 14:54:25.837895101 +0000 UTC m=+0.071309126 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:54:26 np0005544118 nova_compute[187283]: 2025-12-03 14:54:26.353 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:27 np0005544118 nova_compute[187283]: 2025-12-03 14:54:27.610 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:31 np0005544118 nova_compute[187283]: 2025-12-03 14:54:31.358 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:31 np0005544118 nova_compute[187283]: 2025-12-03 14:54:31.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:32 np0005544118 nova_compute[187283]: 2025-12-03 14:54:32.613 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:32 np0005544118 podman[224833]: 2025-12-03 14:54:32.887778313 +0000 UTC m=+0.115294705 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible)
Dec  3 09:54:35 np0005544118 nova_compute[187283]: 2025-12-03 14:54:35.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:35 np0005544118 podman[197639]: time="2025-12-03T14:54:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:54:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:54:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:54:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:54:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 09:54:36 np0005544118 nova_compute[187283]: 2025-12-03 14:54:36.360 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:36 np0005544118 nova_compute[187283]: 2025-12-03 14:54:36.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:37 np0005544118 nova_compute[187283]: 2025-12-03 14:54:37.615 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:38 np0005544118 nova_compute[187283]: 2025-12-03 14:54:38.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:39 np0005544118 nova_compute[187283]: 2025-12-03 14:54:39.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:41 np0005544118 nova_compute[187283]: 2025-12-03 14:54:41.362 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:41 np0005544118 podman[224860]: 2025-12-03 14:54:41.867619334 +0000 UTC m=+0.094251076 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, architecture=x86_64, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.617 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.631 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.841 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.842 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5830MB free_disk=73.33361434936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.843 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.843 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.912 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.913 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:54:42 np0005544118 nova_compute[187283]: 2025-12-03 14:54:42.990 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.007 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.008 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.024 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.046 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.071 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.089 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.119 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:54:43 np0005544118 nova_compute[187283]: 2025-12-03 14:54:43.120 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:54:43 np0005544118 podman[224882]: 2025-12-03 14:54:43.84824574 +0000 UTC m=+0.069286783 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec  3 09:54:45 np0005544118 nova_compute[187283]: 2025-12-03 14:54:45.115 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:45 np0005544118 nova_compute[187283]: 2025-12-03 14:54:45.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:45 np0005544118 nova_compute[187283]: 2025-12-03 14:54:45.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:54:45 np0005544118 nova_compute[187283]: 2025-12-03 14:54:45.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:54:45 np0005544118 nova_compute[187283]: 2025-12-03 14:54:45.654 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:54:46 np0005544118 nova_compute[187283]: 2025-12-03 14:54:46.365 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:47 np0005544118 nova_compute[187283]: 2025-12-03 14:54:47.621 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:54:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:54:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:54:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:54:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:54:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:54:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:54:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:54:51 np0005544118 nova_compute[187283]: 2025-12-03 14:54:51.366 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:51 np0005544118 nova_compute[187283]: 2025-12-03 14:54:51.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:54:51 np0005544118 nova_compute[187283]: 2025-12-03 14:54:51.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:54:51 np0005544118 podman[224904]: 2025-12-03 14:54:51.835460211 +0000 UTC m=+0.060387226 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  3 09:54:52 np0005544118 nova_compute[187283]: 2025-12-03 14:54:52.623 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:56 np0005544118 nova_compute[187283]: 2025-12-03 14:54:56.367 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:54:56 np0005544118 podman[224923]: 2025-12-03 14:54:56.84689388 +0000 UTC m=+0.065440060 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 09:54:57 np0005544118 nova_compute[187283]: 2025-12-03 14:54:57.625 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:55:00.995 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:55:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:55:00.996 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:55:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:55:00.996 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:55:01 np0005544118 nova_compute[187283]: 2025-12-03 14:55:01.368 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:02 np0005544118 nova_compute[187283]: 2025-12-03 14:55:02.628 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:03 np0005544118 podman[224948]: 2025-12-03 14:55:03.938583774 +0000 UTC m=+0.167230296 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  3 09:55:05 np0005544118 podman[197639]: time="2025-12-03T14:55:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:55:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:55:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:55:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:55:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec  3 09:55:06 np0005544118 nova_compute[187283]: 2025-12-03 14:55:06.370 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:07 np0005544118 nova_compute[187283]: 2025-12-03 14:55:07.631 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:11 np0005544118 nova_compute[187283]: 2025-12-03 14:55:11.374 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:12 np0005544118 nova_compute[187283]: 2025-12-03 14:55:12.633 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:12 np0005544118 podman[224974]: 2025-12-03 14:55:12.828382251 +0000 UTC m=+0.060830889 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., version=9.6)
Dec  3 09:55:14 np0005544118 podman[224998]: 2025-12-03 14:55:14.852003279 +0000 UTC m=+0.072091087 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec  3 09:55:16 np0005544118 nova_compute[187283]: 2025-12-03 14:55:16.375 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:17 np0005544118 nova_compute[187283]: 2025-12-03 14:55:17.635 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:55:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:55:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:55:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:55:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:55:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:55:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:55:21 np0005544118 nova_compute[187283]: 2025-12-03 14:55:21.377 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:22 np0005544118 nova_compute[187283]: 2025-12-03 14:55:22.637 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:22 np0005544118 podman[225018]: 2025-12-03 14:55:22.848484275 +0000 UTC m=+0.074859200 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec  3 09:55:26 np0005544118 nova_compute[187283]: 2025-12-03 14:55:26.379 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:27 np0005544118 nova_compute[187283]: 2025-12-03 14:55:27.641 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:27 np0005544118 podman[225037]: 2025-12-03 14:55:27.83941144 +0000 UTC m=+0.068145523 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:55:31 np0005544118 nova_compute[187283]: 2025-12-03 14:55:31.412 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:31 np0005544118 nova_compute[187283]: 2025-12-03 14:55:31.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:32 np0005544118 nova_compute[187283]: 2025-12-03 14:55:32.693 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:34 np0005544118 podman[225061]: 2025-12-03 14:55:34.859131971 +0000 UTC m=+0.097274807 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Dec  3 09:55:35 np0005544118 podman[197639]: time="2025-12-03T14:55:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:55:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:55:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:55:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:55:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec  3 09:55:36 np0005544118 nova_compute[187283]: 2025-12-03 14:55:36.514 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:37 np0005544118 nova_compute[187283]: 2025-12-03 14:55:37.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:37 np0005544118 nova_compute[187283]: 2025-12-03 14:55:37.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:37 np0005544118 nova_compute[187283]: 2025-12-03 14:55:37.696 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:39 np0005544118 nova_compute[187283]: 2025-12-03 14:55:39.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:41 np0005544118 nova_compute[187283]: 2025-12-03 14:55:41.515 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:41 np0005544118 nova_compute[187283]: 2025-12-03 14:55:41.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:42 np0005544118 nova_compute[187283]: 2025-12-03 14:55:42.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:42 np0005544118 nova_compute[187283]: 2025-12-03 14:55:42.699 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:42 np0005544118 nova_compute[187283]: 2025-12-03 14:55:42.950 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:55:42 np0005544118 nova_compute[187283]: 2025-12-03 14:55:42.950 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:55:42 np0005544118 nova_compute[187283]: 2025-12-03 14:55:42.951 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:55:42 np0005544118 nova_compute[187283]: 2025-12-03 14:55:42.951 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.122 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.123 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5838MB free_disk=73.33370208740234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.123 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.124 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.190 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.191 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.216 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.265 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.267 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:55:43 np0005544118 nova_compute[187283]: 2025-12-03 14:55:43.267 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:55:43 np0005544118 podman[225088]: 2025-12-03 14:55:43.874373302 +0000 UTC m=+0.092161241 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec  3 09:55:45 np0005544118 podman[225111]: 2025-12-03 14:55:45.856829567 +0000 UTC m=+0.086033878 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  3 09:55:46 np0005544118 nova_compute[187283]: 2025-12-03 14:55:46.263 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:46 np0005544118 nova_compute[187283]: 2025-12-03 14:55:46.544 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:46 np0005544118 nova_compute[187283]: 2025-12-03 14:55:46.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:46 np0005544118 nova_compute[187283]: 2025-12-03 14:55:46.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:55:46 np0005544118 nova_compute[187283]: 2025-12-03 14:55:46.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:55:46 np0005544118 nova_compute[187283]: 2025-12-03 14:55:46.752 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:55:47 np0005544118 nova_compute[187283]: 2025-12-03 14:55:47.737 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:55:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:55:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:55:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:55:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:55:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:55:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:55:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:55:50 np0005544118 nova_compute[187283]: 2025-12-03 14:55:50.748 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:51 np0005544118 nova_compute[187283]: 2025-12-03 14:55:51.577 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:51 np0005544118 nova_compute[187283]: 2025-12-03 14:55:51.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:55:51 np0005544118 nova_compute[187283]: 2025-12-03 14:55:51.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:55:52 np0005544118 nova_compute[187283]: 2025-12-03 14:55:52.740 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:53 np0005544118 podman[225131]: 2025-12-03 14:55:53.846803462 +0000 UTC m=+0.069689894 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec  3 09:55:56 np0005544118 nova_compute[187283]: 2025-12-03 14:55:56.580 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:57 np0005544118 nova_compute[187283]: 2025-12-03 14:55:57.743 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:55:58 np0005544118 podman[225150]: 2025-12-03 14:55:58.830437361 +0000 UTC m=+0.066505558 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:56:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:56:00.997 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:56:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:56:00.997 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:56:00 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:56:00.997 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:56:01 np0005544118 nova_compute[187283]: 2025-12-03 14:56:01.582 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:02 np0005544118 nova_compute[187283]: 2025-12-03 14:56:02.746 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:05 np0005544118 podman[197639]: time="2025-12-03T14:56:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:56:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:56:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:56:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:56:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec  3 09:56:05 np0005544118 podman[225175]: 2025-12-03 14:56:05.852917614 +0000 UTC m=+0.086416553 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251125)
Dec  3 09:56:06 np0005544118 nova_compute[187283]: 2025-12-03 14:56:06.627 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:07 np0005544118 nova_compute[187283]: 2025-12-03 14:56:07.748 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:11 np0005544118 nova_compute[187283]: 2025-12-03 14:56:11.630 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:12 np0005544118 nova_compute[187283]: 2025-12-03 14:56:12.751 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:14 np0005544118 podman[225202]: 2025-12-03 14:56:14.878639527 +0000 UTC m=+0.103616534 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=edpm, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:56:16 np0005544118 nova_compute[187283]: 2025-12-03 14:56:16.631 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:16 np0005544118 podman[225223]: 2025-12-03 14:56:16.835368478 +0000 UTC m=+0.068085562 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec  3 09:56:17 np0005544118 nova_compute[187283]: 2025-12-03 14:56:17.754 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:56:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:56:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:56:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:56:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:56:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:56:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:56:21 np0005544118 nova_compute[187283]: 2025-12-03 14:56:21.632 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:22 np0005544118 nova_compute[187283]: 2025-12-03 14:56:22.756 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:24 np0005544118 podman[225244]: 2025-12-03 14:56:24.85858692 +0000 UTC m=+0.074786976 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec  3 09:56:26 np0005544118 nova_compute[187283]: 2025-12-03 14:56:26.634 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:27 np0005544118 nova_compute[187283]: 2025-12-03 14:56:27.758 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:29 np0005544118 podman[225264]: 2025-12-03 14:56:29.826242627 +0000 UTC m=+0.058591583 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:56:31 np0005544118 nova_compute[187283]: 2025-12-03 14:56:31.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:56:31 np0005544118 nova_compute[187283]: 2025-12-03 14:56:31.637 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:32 np0005544118 nova_compute[187283]: 2025-12-03 14:56:32.760 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:35 np0005544118 podman[197639]: time="2025-12-03T14:56:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:56:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:56:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:56:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:56:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec  3 09:56:36 np0005544118 nova_compute[187283]: 2025-12-03 14:56:36.678 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:36 np0005544118 podman[225288]: 2025-12-03 14:56:36.874774487 +0000 UTC m=+0.104799175 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125)
Dec  3 09:56:37 np0005544118 nova_compute[187283]: 2025-12-03 14:56:37.802 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:38 np0005544118 nova_compute[187283]: 2025-12-03 14:56:38.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:56:39 np0005544118 nova_compute[187283]: 2025-12-03 14:56:39.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:56:41 np0005544118 nova_compute[187283]: 2025-12-03 14:56:41.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:56:41 np0005544118 nova_compute[187283]: 2025-12-03 14:56:41.701 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.643 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.644 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.644 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.645 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.835 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.921 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.923 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5835MB free_disk=73.33375930786133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.923 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:56:42 np0005544118 nova_compute[187283]: 2025-12-03 14:56:42.924 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:56:43 np0005544118 nova_compute[187283]: 2025-12-03 14:56:43.004 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec  3 09:56:43 np0005544118 nova_compute[187283]: 2025-12-03 14:56:43.004 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec  3 09:56:43 np0005544118 nova_compute[187283]: 2025-12-03 14:56:43.032 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec  3 09:56:43 np0005544118 nova_compute[187283]: 2025-12-03 14:56:43.068 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec  3 09:56:43 np0005544118 nova_compute[187283]: 2025-12-03 14:56:43.071 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec  3 09:56:43 np0005544118 nova_compute[187283]: 2025-12-03 14:56:43.071 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.148s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:56:44 np0005544118 nova_compute[187283]: 2025-12-03 14:56:44.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:56:44 np0005544118 nova_compute[187283]: 2025-12-03 14:56:44.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec  3 09:56:45 np0005544118 podman[225312]: 2025-12-03 14:56:45.895954685 +0000 UTC m=+0.113860283 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, version=9.6, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Dec  3 09:56:46 np0005544118 nova_compute[187283]: 2025-12-03 14:56:46.621 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:56:46 np0005544118 nova_compute[187283]: 2025-12-03 14:56:46.749 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:56:47 np0005544118 nova_compute[187283]: 2025-12-03 14:56:47.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:56:47 np0005544118 nova_compute[187283]: 2025-12-03 14:56:47.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec  3 09:56:47 np0005544118 nova_compute[187283]: 2025-12-03 14:56:47.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec  3 09:56:47 np0005544118 nova_compute[187283]: 2025-12-03 14:56:47.797 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec  3 09:56:47 np0005544118 nova_compute[187283]: 2025-12-03 14:56:47.879 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:56:47 np0005544118 podman[225333]: 2025-12-03 14:56:47.925389073 +0000 UTC m=+0.140036708 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Dec  3 09:56:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:56:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:56:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:56:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:56:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:56:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:56:51 np0005544118 nova_compute[187283]: 2025-12-03 14:56:51.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:56:51 np0005544118 nova_compute[187283]: 2025-12-03 14:56:51.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec  3 09:56:51 np0005544118 nova_compute[187283]: 2025-12-03 14:56:51.753 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:56:52 np0005544118 nova_compute[187283]: 2025-12-03 14:56:52.881 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:56:55 np0005544118 podman[225353]: 2025-12-03 14:56:55.855374855 +0000 UTC m=+0.080888352 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec  3 09:56:56 np0005544118 nova_compute[187283]: 2025-12-03 14:56:56.755 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:56:57 np0005544118 nova_compute[187283]: 2025-12-03 14:56:57.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:56:57 np0005544118 nova_compute[187283]: 2025-12-03 14:56:57.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec  3 09:56:57 np0005544118 nova_compute[187283]: 2025-12-03 14:56:57.641 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec  3 09:56:57 np0005544118 nova_compute[187283]: 2025-12-03 14:56:57.906 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:00 np0005544118 podman[225372]: 2025-12-03 14:57:00.836345715 +0000 UTC m=+0.064487244 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 09:57:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:57:00.999 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec  3 09:57:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:57:00.999 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec  3 09:57:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:57:01.000 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec  3 09:57:01 np0005544118 nova_compute[187283]: 2025-12-03 14:57:01.759 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:02 np0005544118 nova_compute[187283]: 2025-12-03 14:57:02.947 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:05 np0005544118 nova_compute[187283]: 2025-12-03 14:57:05.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec  3 09:57:05 np0005544118 podman[197639]: time="2025-12-03T14:57:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:57:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:57:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:57:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:57:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2602 "" "Go-http-client/1.1"
Dec  3 09:57:06 np0005544118 nova_compute[187283]: 2025-12-03 14:57:06.792 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:07 np0005544118 podman[225397]: 2025-12-03 14:57:07.925247101 +0000 UTC m=+0.143905802 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 09:57:07 np0005544118 nova_compute[187283]: 2025-12-03 14:57:07.948 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:11 np0005544118 nova_compute[187283]: 2025-12-03 14:57:11.795 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:12 np0005544118 nova_compute[187283]: 2025-12-03 14:57:12.989 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:16 np0005544118 nova_compute[187283]: 2025-12-03 14:57:16.798 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:16 np0005544118 podman[225423]: 2025-12-03 14:57:16.833869912 +0000 UTC m=+0.069816509 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, release=1755695350, version=9.6, container_name=openstack_network_exporter, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Dec  3 09:57:17 np0005544118 nova_compute[187283]: 2025-12-03 14:57:17.991 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:18 np0005544118 podman[225444]: 2025-12-03 14:57:18.837571987 +0000 UTC m=+0.056214288 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec  3 09:57:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:57:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:57:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:57:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:57:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:57:21 np0005544118 nova_compute[187283]: 2025-12-03 14:57:21.800 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:23 np0005544118 nova_compute[187283]: 2025-12-03 14:57:23.035 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:25 np0005544118 nova_compute[187283]: 2025-12-03 14:57:25.869 187287 DEBUG oslo_concurrency.processutils [None req-9f41af11-234d-4317-9c5b-3a3dad4d6033 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec  3 09:57:25 np0005544118 nova_compute[187283]: 2025-12-03 14:57:25.905 187287 DEBUG oslo_concurrency.processutils [None req-9f41af11-234d-4317-9c5b-3a3dad4d6033 bcf14db88a094daba66e320cb2d85dbf ecdb6bc2f490401f83229422485f1b7a - - default default] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec  3 09:57:26 np0005544118 nova_compute[187283]: 2025-12-03 14:57:26.801 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec  3 09:57:26 np0005544118 podman[225465]: 2025-12-03 14:57:26.825891344 +0000 UTC m=+0.063083126 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec  3 09:57:28 np0005544118 nova_compute[187283]: 2025-12-03 14:57:28.038 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:31 np0005544118 nova_compute[187283]: 2025-12-03 14:57:31.804 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:31 np0005544118 podman[225484]: 2025-12-03 14:57:31.844496914 +0000 UTC m=+0.077422637 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 09:57:32 np0005544118 nova_compute[187283]: 2025-12-03 14:57:32.621 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:33 np0005544118 nova_compute[187283]: 2025-12-03 14:57:33.041 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:33 np0005544118 nova_compute[187283]: 2025-12-03 14:57:33.319 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:57:33.320 104491 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '42:47:f6', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'c6:3e:09:06:ee:2b'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec  3 09:57:33 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:57:33.321 104491 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec  3 09:57:35 np0005544118 podman[197639]: time="2025-12-03T14:57:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:57:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:57:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:57:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:57:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2606 "" "Go-http-client/1.1"
Dec  3 09:57:36 np0005544118 nova_compute[187283]: 2025-12-03 14:57:36.806 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:38 np0005544118 nova_compute[187283]: 2025-12-03 14:57:38.043 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:38 np0005544118 podman[225508]: 2025-12-03 14:57:38.879859094 +0000 UTC m=+0.115383465 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  3 09:57:39 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:57:39.325 104491 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac9297d1-94e5-43bb-91f9-3d345a639adf, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec  3 09:57:39 np0005544118 nova_compute[187283]: 2025-12-03 14:57:39.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:41 np0005544118 nova_compute[187283]: 2025-12-03 14:57:41.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:41 np0005544118 nova_compute[187283]: 2025-12-03 14:57:41.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:41 np0005544118 nova_compute[187283]: 2025-12-03 14:57:41.808 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:42 np0005544118 nova_compute[187283]: 2025-12-03 14:57:42.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.045 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.630 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.632 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.835 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.836 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5835MB free_disk=73.33377838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.836 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.837 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.985 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:57:43 np0005544118 nova_compute[187283]: 2025-12-03 14:57:43.986 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:57:44 np0005544118 nova_compute[187283]: 2025-12-03 14:57:44.019 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:57:44 np0005544118 nova_compute[187283]: 2025-12-03 14:57:44.041 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:57:44 np0005544118 nova_compute[187283]: 2025-12-03 14:57:44.043 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:57:44 np0005544118 nova_compute[187283]: 2025-12-03 14:57:44.044 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:57:46 np0005544118 nova_compute[187283]: 2025-12-03 14:57:46.811 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:47 np0005544118 podman[225534]: 2025-12-03 14:57:47.861769188 +0000 UTC m=+0.084955843 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Dec  3 09:57:48 np0005544118 nova_compute[187283]: 2025-12-03 14:57:48.040 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:48 np0005544118 nova_compute[187283]: 2025-12-03 14:57:48.041 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:48 np0005544118 nova_compute[187283]: 2025-12-03 14:57:48.041 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:57:48 np0005544118 nova_compute[187283]: 2025-12-03 14:57:48.041 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:57:48 np0005544118 nova_compute[187283]: 2025-12-03 14:57:48.048 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:48 np0005544118 nova_compute[187283]: 2025-12-03 14:57:48.059 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:57:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:57:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:57:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:57:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:57:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:57:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:57:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:57:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:57:49 np0005544118 podman[225556]: 2025-12-03 14:57:49.834127105 +0000 UTC m=+0.072708368 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 09:57:50 np0005544118 nova_compute[187283]: 2025-12-03 14:57:50.620 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:51 np0005544118 nova_compute[187283]: 2025-12-03 14:57:51.813 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:53 np0005544118 nova_compute[187283]: 2025-12-03 14:57:53.050 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:53 np0005544118 nova_compute[187283]: 2025-12-03 14:57:53.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:57:53 np0005544118 nova_compute[187283]: 2025-12-03 14:57:53.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:57:56 np0005544118 nova_compute[187283]: 2025-12-03 14:57:56.817 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:57:57 np0005544118 podman[225576]: 2025-12-03 14:57:57.824937919 +0000 UTC m=+0.058122821 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  3 09:57:58 np0005544118 nova_compute[187283]: 2025-12-03 14:57:58.053 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:58:01.002 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:58:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:58:01.002 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:58:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:58:01.002 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:58:01 np0005544118 nova_compute[187283]: 2025-12-03 14:58:01.863 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:02 np0005544118 podman[225596]: 2025-12-03 14:58:02.870473128 +0000 UTC m=+0.095274077 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:58:03 np0005544118 nova_compute[187283]: 2025-12-03 14:58:03.055 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:05 np0005544118 podman[197639]: time="2025-12-03T14:58:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:58:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:58:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:58:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:58:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2604 "" "Go-http-client/1.1"
Dec  3 09:58:06 np0005544118 nova_compute[187283]: 2025-12-03 14:58:06.864 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:08 np0005544118 nova_compute[187283]: 2025-12-03 14:58:08.059 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:09 np0005544118 podman[225620]: 2025-12-03 14:58:09.84844928 +0000 UTC m=+0.079171431 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec  3 09:58:11 np0005544118 nova_compute[187283]: 2025-12-03 14:58:11.885 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:13 np0005544118 nova_compute[187283]: 2025-12-03 14:58:13.062 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:16 np0005544118 nova_compute[187283]: 2025-12-03 14:58:16.922 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:18 np0005544118 nova_compute[187283]: 2025-12-03 14:58:18.065 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:18 np0005544118 podman[225646]: 2025-12-03 14:58:18.828669171 +0000 UTC m=+0.060080335 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec  3 09:58:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:58:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:58:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:58:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:58:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:58:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:58:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:58:20 np0005544118 podman[225667]: 2025-12-03 14:58:20.829404072 +0000 UTC m=+0.056792117 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Dec  3 09:58:21 np0005544118 nova_compute[187283]: 2025-12-03 14:58:21.976 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:23 np0005544118 nova_compute[187283]: 2025-12-03 14:58:23.094 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:26 np0005544118 nova_compute[187283]: 2025-12-03 14:58:26.979 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:28 np0005544118 nova_compute[187283]: 2025-12-03 14:58:28.096 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:28 np0005544118 podman[225690]: 2025-12-03 14:58:28.819002137 +0000 UTC m=+0.054579068 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec  3 09:58:31 np0005544118 nova_compute[187283]: 2025-12-03 14:58:31.981 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:33 np0005544118 nova_compute[187283]: 2025-12-03 14:58:33.099 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:33 np0005544118 podman[225710]: 2025-12-03 14:58:33.807784892 +0000 UTC m=+0.041375909 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec  3 09:58:34 np0005544118 nova_compute[187283]: 2025-12-03 14:58:34.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:35 np0005544118 podman[197639]: time="2025-12-03T14:58:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:58:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:58:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:58:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:58:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2608 "" "Go-http-client/1.1"
Dec  3 09:58:37 np0005544118 nova_compute[187283]: 2025-12-03 14:58:37.019 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:38 np0005544118 nova_compute[187283]: 2025-12-03 14:58:38.150 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:40 np0005544118 podman[225734]: 2025-12-03 14:58:40.851350359 +0000 UTC m=+0.085513388 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec  3 09:58:41 np0005544118 nova_compute[187283]: 2025-12-03 14:58:41.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:42 np0005544118 nova_compute[187283]: 2025-12-03 14:58:42.021 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:42 np0005544118 nova_compute[187283]: 2025-12-03 14:58:42.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.152 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.637 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.638 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.638 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.639 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.827 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.828 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5824MB free_disk=73.33375930786133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.828 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.828 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.897 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.897 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:58:43 np0005544118 nova_compute[187283]: 2025-12-03 14:58:43.981 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:58:44 np0005544118 nova_compute[187283]: 2025-12-03 14:58:44.010 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:58:44 np0005544118 nova_compute[187283]: 2025-12-03 14:58:44.013 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:58:44 np0005544118 nova_compute[187283]: 2025-12-03 14:58:44.013 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:58:45 np0005544118 nova_compute[187283]: 2025-12-03 14:58:45.014 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:47 np0005544118 nova_compute[187283]: 2025-12-03 14:58:47.024 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:48 np0005544118 nova_compute[187283]: 2025-12-03 14:58:48.154 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:48 np0005544118 nova_compute[187283]: 2025-12-03 14:58:48.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:48 np0005544118 nova_compute[187283]: 2025-12-03 14:58:48.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:58:48 np0005544118 nova_compute[187283]: 2025-12-03 14:58:48.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:58:48 np0005544118 nova_compute[187283]: 2025-12-03 14:58:48.644 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:58:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:58:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:58:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:58:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:58:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:58:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:58:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:58:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:58:49 np0005544118 nova_compute[187283]: 2025-12-03 14:58:49.638 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:49 np0005544118 podman[225760]: 2025-12-03 14:58:49.85228358 +0000 UTC m=+0.069399592 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Dec  3 09:58:51 np0005544118 podman[225781]: 2025-12-03 14:58:51.820401327 +0000 UTC m=+0.052801981 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec  3 09:58:52 np0005544118 nova_compute[187283]: 2025-12-03 14:58:52.026 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:53 np0005544118 nova_compute[187283]: 2025-12-03 14:58:53.157 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:54 np0005544118 nova_compute[187283]: 2025-12-03 14:58:54.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:58:54 np0005544118 nova_compute[187283]: 2025-12-03 14:58:54.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:58:57 np0005544118 nova_compute[187283]: 2025-12-03 14:58:57.027 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:58 np0005544118 nova_compute[187283]: 2025-12-03 14:58:58.160 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:58:59 np0005544118 podman[225801]: 2025-12-03 14:58:59.855264062 +0000 UTC m=+0.073340355 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec  3 09:59:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:59:01.006 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:59:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:59:01.007 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:59:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 14:59:01.007 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:59:02 np0005544118 nova_compute[187283]: 2025-12-03 14:59:02.032 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:03 np0005544118 nova_compute[187283]: 2025-12-03 14:59:03.163 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:04 np0005544118 podman[225822]: 2025-12-03 14:59:04.844112008 +0000 UTC m=+0.069671728 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:59:05 np0005544118 podman[197639]: time="2025-12-03T14:59:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:59:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:59:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:59:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:59:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2605 "" "Go-http-client/1.1"
Dec  3 09:59:07 np0005544118 nova_compute[187283]: 2025-12-03 14:59:07.036 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:08 np0005544118 nova_compute[187283]: 2025-12-03 14:59:08.170 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:11 np0005544118 podman[225847]: 2025-12-03 14:59:11.880527717 +0000 UTC m=+0.110868561 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:59:12 np0005544118 nova_compute[187283]: 2025-12-03 14:59:12.039 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:13 np0005544118 nova_compute[187283]: 2025-12-03 14:59:13.172 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:17 np0005544118 nova_compute[187283]: 2025-12-03 14:59:17.073 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:18 np0005544118 nova_compute[187283]: 2025-12-03 14:59:18.218 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:59:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:59:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:59:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:59:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:59:19 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:59:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:59:20 np0005544118 podman[225874]: 2025-12-03 14:59:20.832021316 +0000 UTC m=+0.060825873 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:59:22 np0005544118 nova_compute[187283]: 2025-12-03 14:59:22.075 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:22 np0005544118 podman[225896]: 2025-12-03 14:59:22.852490202 +0000 UTC m=+0.073177111 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec  3 09:59:23 np0005544118 nova_compute[187283]: 2025-12-03 14:59:23.222 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:27 np0005544118 nova_compute[187283]: 2025-12-03 14:59:27.078 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:28 np0005544118 nova_compute[187283]: 2025-12-03 14:59:28.226 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:30 np0005544118 podman[225917]: 2025-12-03 14:59:30.812394729 +0000 UTC m=+0.047765298 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec  3 09:59:32 np0005544118 nova_compute[187283]: 2025-12-03 14:59:32.078 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:33 np0005544118 nova_compute[187283]: 2025-12-03 14:59:33.228 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:35 np0005544118 podman[197639]: time="2025-12-03T14:59:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 09:59:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:59:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 09:59:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:14:59:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2597 "" "Go-http-client/1.1"
Dec  3 09:59:35 np0005544118 podman[225937]: 2025-12-03 14:59:35.860015883 +0000 UTC m=+0.082605211 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 09:59:36 np0005544118 nova_compute[187283]: 2025-12-03 14:59:36.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:37 np0005544118 nova_compute[187283]: 2025-12-03 14:59:37.081 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:38 np0005544118 nova_compute[187283]: 2025-12-03 14:59:38.275 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:42 np0005544118 nova_compute[187283]: 2025-12-03 14:59:42.084 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:42 np0005544118 podman[225962]: 2025-12-03 14:59:42.861698562 +0000 UTC m=+0.097645780 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 09:59:43 np0005544118 nova_compute[187283]: 2025-12-03 14:59:43.278 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:43 np0005544118 nova_compute[187283]: 2025-12-03 14:59:43.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:43 np0005544118 nova_compute[187283]: 2025-12-03 14:59:43.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.633 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.634 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.634 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.635 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.831 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.833 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5850MB free_disk=73.33377838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.833 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.833 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.891 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.892 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 09:59:44 np0005544118 nova_compute[187283]: 2025-12-03 14:59:44.987 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing inventories for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.002 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating ProviderTree inventory for provider 52e95542-7192-4eec-a5dc-18596ad73a72 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.003 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Updating inventory in ProviderTree for provider 52e95542-7192-4eec-a5dc-18596ad73a72 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.020 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing aggregate associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.050 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Refreshing trait associations for resource provider 52e95542-7192-4eec-a5dc-18596ad73a72, traits: COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_MMX,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_USB,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSSE3,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.077 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.098 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.100 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 09:59:45 np0005544118 nova_compute[187283]: 2025-12-03 14:59:45.101 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 09:59:46 np0005544118 nova_compute[187283]: 2025-12-03 14:59:46.103 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:47 np0005544118 nova_compute[187283]: 2025-12-03 14:59:47.086 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:48 np0005544118 nova_compute[187283]: 2025-12-03 14:59:48.279 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 09:59:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:59:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 09:59:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 09:59:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:59:49 np0005544118 openstack_network_exporter[199786]: ERROR   14:59:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 09:59:49 np0005544118 openstack_network_exporter[199786]: 
Dec  3 09:59:49 np0005544118 nova_compute[187283]: 2025-12-03 14:59:49.603 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:49 np0005544118 nova_compute[187283]: 2025-12-03 14:59:49.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:49 np0005544118 nova_compute[187283]: 2025-12-03 14:59:49.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 09:59:49 np0005544118 nova_compute[187283]: 2025-12-03 14:59:49.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 09:59:49 np0005544118 nova_compute[187283]: 2025-12-03 14:59:49.622 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 09:59:50 np0005544118 nova_compute[187283]: 2025-12-03 14:59:50.619 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:51 np0005544118 podman[225989]: 2025-12-03 14:59:51.83047851 +0000 UTC m=+0.062448797 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 09:59:52 np0005544118 nova_compute[187283]: 2025-12-03 14:59:52.089 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:53 np0005544118 nova_compute[187283]: 2025-12-03 14:59:53.282 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:53 np0005544118 podman[226011]: 2025-12-03 14:59:53.857373085 +0000 UTC m=+0.074037004 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec  3 09:59:55 np0005544118 nova_compute[187283]: 2025-12-03 14:59:55.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 09:59:55 np0005544118 nova_compute[187283]: 2025-12-03 14:59:55.607 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 09:59:57 np0005544118 nova_compute[187283]: 2025-12-03 14:59:57.089 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 09:59:58 np0005544118 nova_compute[187283]: 2025-12-03 14:59:58.285 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:00:01.007 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 10:00:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:00:01.007 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 10:00:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:00:01.008 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 10:00:01 np0005544118 podman[226031]: 2025-12-03 15:00:01.837304192 +0000 UTC m=+0.062557680 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 10:00:02 np0005544118 nova_compute[187283]: 2025-12-03 15:00:02.094 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:03 np0005544118 nova_compute[187283]: 2025-12-03 15:00:03.288 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:05 np0005544118 podman[197639]: time="2025-12-03T15:00:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 10:00:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:00:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 10:00:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:00:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2600 "" "Go-http-client/1.1"
Dec  3 10:00:06 np0005544118 podman[226050]: 2025-12-03 15:00:06.839881761 +0000 UTC m=+0.066612747 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 10:00:07 np0005544118 nova_compute[187283]: 2025-12-03 15:00:07.093 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:08 np0005544118 nova_compute[187283]: 2025-12-03 15:00:08.291 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:12 np0005544118 nova_compute[187283]: 2025-12-03 15:00:12.094 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:13 np0005544118 nova_compute[187283]: 2025-12-03 15:00:13.293 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:13 np0005544118 podman[226074]: 2025-12-03 15:00:13.913883047 +0000 UTC m=+0.147542064 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec  3 10:00:17 np0005544118 nova_compute[187283]: 2025-12-03 15:00:17.096 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:18 np0005544118 nova_compute[187283]: 2025-12-03 15:00:18.296 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 10:00:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:00:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:00:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 10:00:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 10:00:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 10:00:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 10:00:22 np0005544118 nova_compute[187283]: 2025-12-03 15:00:22.124 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:22 np0005544118 podman[226101]: 2025-12-03 15:00:22.851911879 +0000 UTC m=+0.076706684 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec  3 10:00:23 np0005544118 nova_compute[187283]: 2025-12-03 15:00:23.299 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:24 np0005544118 podman[226123]: 2025-12-03 15:00:24.849230021 +0000 UTC m=+0.077186018 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 10:00:27 np0005544118 nova_compute[187283]: 2025-12-03 15:00:27.160 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:28 np0005544118 nova_compute[187283]: 2025-12-03 15:00:28.353 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:32 np0005544118 nova_compute[187283]: 2025-12-03 15:00:32.206 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:32 np0005544118 podman[226143]: 2025-12-03 15:00:32.851909342 +0000 UTC m=+0.071736954 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec  3 10:00:33 np0005544118 nova_compute[187283]: 2025-12-03 15:00:33.403 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:35 np0005544118 podman[197639]: time="2025-12-03T15:00:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 10:00:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:00:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 10:00:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:00:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 10:00:36 np0005544118 nova_compute[187283]: 2025-12-03 15:00:36.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:37 np0005544118 nova_compute[187283]: 2025-12-03 15:00:37.208 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:37 np0005544118 podman[226163]: 2025-12-03 15:00:37.840336735 +0000 UTC m=+0.074441165 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec  3 10:00:38 np0005544118 nova_compute[187283]: 2025-12-03 15:00:38.443 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:42 np0005544118 nova_compute[187283]: 2025-12-03 15:00:42.264 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:43 np0005544118 nova_compute[187283]: 2025-12-03 15:00:43.484 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:43 np0005544118 nova_compute[187283]: 2025-12-03 15:00:43.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:44 np0005544118 nova_compute[187283]: 2025-12-03 15:00:44.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:44 np0005544118 nova_compute[187283]: 2025-12-03 15:00:44.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:44 np0005544118 podman[226188]: 2025-12-03 15:00:44.913222823 +0000 UTC m=+0.132493735 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.630 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.631 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.631 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.805 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.806 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5843MB free_disk=73.33398056030273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.806 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 10:00:45 np0005544118 nova_compute[187283]: 2025-12-03 15:00:45.806 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 10:00:46 np0005544118 nova_compute[187283]: 2025-12-03 15:00:46.030 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 10:00:46 np0005544118 nova_compute[187283]: 2025-12-03 15:00:46.031 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 10:00:46 np0005544118 nova_compute[187283]: 2025-12-03 15:00:46.057 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 10:00:46 np0005544118 nova_compute[187283]: 2025-12-03 15:00:46.152 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 10:00:46 np0005544118 nova_compute[187283]: 2025-12-03 15:00:46.154 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 10:00:46 np0005544118 nova_compute[187283]: 2025-12-03 15:00:46.155 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 10:00:47 np0005544118 nova_compute[187283]: 2025-12-03 15:00:47.267 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:48 np0005544118 nova_compute[187283]: 2025-12-03 15:00:48.155 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:48 np0005544118 nova_compute[187283]: 2025-12-03 15:00:48.528 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 10:00:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:00:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:00:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 10:00:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:00:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 10:00:49 np0005544118 nova_compute[187283]: 2025-12-03 15:00:49.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:51 np0005544118 nova_compute[187283]: 2025-12-03 15:00:51.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:51 np0005544118 nova_compute[187283]: 2025-12-03 15:00:51.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 10:00:51 np0005544118 nova_compute[187283]: 2025-12-03 15:00:51.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 10:00:51 np0005544118 nova_compute[187283]: 2025-12-03 15:00:51.638 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 10:00:52 np0005544118 nova_compute[187283]: 2025-12-03 15:00:52.268 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:53 np0005544118 nova_compute[187283]: 2025-12-03 15:00:53.530 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:53 np0005544118 podman[226214]: 2025-12-03 15:00:53.840121801 +0000 UTC m=+0.068299172 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=edpm, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal)
Dec  3 10:00:55 np0005544118 podman[226238]: 2025-12-03 15:00:55.86452731 +0000 UTC m=+0.091859366 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3)
Dec  3 10:00:56 np0005544118 nova_compute[187283]: 2025-12-03 15:00:56.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:00:56 np0005544118 nova_compute[187283]: 2025-12-03 15:00:56.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 10:00:57 np0005544118 nova_compute[187283]: 2025-12-03 15:00:57.270 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:00:58 np0005544118 nova_compute[187283]: 2025-12-03 15:00:58.532 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:01:01.009 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 10:01:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:01:01.010 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 10:01:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:01:01.010 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 10:01:02 np0005544118 nova_compute[187283]: 2025-12-03 15:01:02.302 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:03 np0005544118 nova_compute[187283]: 2025-12-03 15:01:03.535 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:03 np0005544118 podman[226271]: 2025-12-03 15:01:03.853467136 +0000 UTC m=+0.077852594 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec  3 10:01:05 np0005544118 podman[197639]: time="2025-12-03T15:01:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 10:01:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:01:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 10:01:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:01:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 10:01:07 np0005544118 nova_compute[187283]: 2025-12-03 15:01:07.302 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:08 np0005544118 nova_compute[187283]: 2025-12-03 15:01:08.538 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:08 np0005544118 podman[226291]: 2025-12-03 15:01:08.832185363 +0000 UTC m=+0.058602595 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec  3 10:01:12 np0005544118 nova_compute[187283]: 2025-12-03 15:01:12.305 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:13 np0005544118 nova_compute[187283]: 2025-12-03 15:01:13.540 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:15 np0005544118 podman[226315]: 2025-12-03 15:01:15.85951197 +0000 UTC m=+0.091496967 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 10:01:17 np0005544118 nova_compute[187283]: 2025-12-03 15:01:17.348 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:18 np0005544118 nova_compute[187283]: 2025-12-03 15:01:18.584 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:01:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:01:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 10:01:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 10:01:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 10:01:22 np0005544118 nova_compute[187283]: 2025-12-03 15:01:22.349 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:23 np0005544118 nova_compute[187283]: 2025-12-03 15:01:23.586 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:24 np0005544118 podman[226341]: 2025-12-03 15:01:24.819974058 +0000 UTC m=+0.052867263 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Dec  3 10:01:26 np0005544118 podman[226362]: 2025-12-03 15:01:26.85129832 +0000 UTC m=+0.081368039 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2)
Dec  3 10:01:27 np0005544118 nova_compute[187283]: 2025-12-03 15:01:27.350 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:28 np0005544118 nova_compute[187283]: 2025-12-03 15:01:28.588 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:32 np0005544118 nova_compute[187283]: 2025-12-03 15:01:32.410 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:33 np0005544118 nova_compute[187283]: 2025-12-03 15:01:33.590 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:34 np0005544118 podman[226382]: 2025-12-03 15:01:34.818197572 +0000 UTC m=+0.051469156 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Dec  3 10:01:35 np0005544118 podman[197639]: time="2025-12-03T15:01:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 10:01:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:01:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 10:01:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:01:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 10:01:37 np0005544118 nova_compute[187283]: 2025-12-03 15:01:37.451 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:38 np0005544118 nova_compute[187283]: 2025-12-03 15:01:38.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:38 np0005544118 nova_compute[187283]: 2025-12-03 15:01:38.641 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:39 np0005544118 podman[226402]: 2025-12-03 15:01:39.832559095 +0000 UTC m=+0.064934632 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec  3 10:01:42 np0005544118 nova_compute[187283]: 2025-12-03 15:01:42.511 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:43 np0005544118 nova_compute[187283]: 2025-12-03 15:01:43.681 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:44 np0005544118 nova_compute[187283]: 2025-12-03 15:01:44.606 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:44 np0005544118 nova_compute[187283]: 2025-12-03 15:01:44.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.635 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.636 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.636 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.636 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.820 187287 WARNING nova.virt.libvirt.driver [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.820 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5843MB free_disk=73.33377838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.821 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.821 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.908 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.909 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.929 187287 DEBUG nova.compute.provider_tree [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed in ProviderTree for provider: 52e95542-7192-4eec-a5dc-18596ad73a72 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.954 187287 DEBUG nova.scheduler.client.report [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Inventory has not changed for provider 52e95542-7192-4eec-a5dc-18596ad73a72 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.956 187287 DEBUG nova.compute.resource_tracker [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec  3 10:01:45 np0005544118 nova_compute[187283]: 2025-12-03 15:01:45.956 187287 DEBUG oslo_concurrency.lockutils [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 10:01:46 np0005544118 podman[226429]: 2025-12-03 15:01:46.892215991 +0000 UTC m=+0.118331488 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec  3 10:01:47 np0005544118 nova_compute[187283]: 2025-12-03 15:01:47.100 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:47 np0005544118 nova_compute[187283]: 2025-12-03 15:01:47.561 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:48 np0005544118 nova_compute[187283]: 2025-12-03 15:01:48.725 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:49 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 10:01:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:01:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:49 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:01:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:49 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 10:01:49 np0005544118 openstack_network_exporter[199786]: ERROR   15:01:49 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 10:01:49 np0005544118 nova_compute[187283]: 2025-12-03 15:01:49.646 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:51 np0005544118 nova_compute[187283]: 2025-12-03 15:01:51.602 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:52 np0005544118 nova_compute[187283]: 2025-12-03 15:01:52.611 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:52 np0005544118 nova_compute[187283]: 2025-12-03 15:01:52.613 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec  3 10:01:52 np0005544118 nova_compute[187283]: 2025-12-03 15:01:52.613 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec  3 10:01:52 np0005544118 nova_compute[187283]: 2025-12-03 15:01:52.616 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:52 np0005544118 nova_compute[187283]: 2025-12-03 15:01:52.637 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec  3 10:01:53 np0005544118 nova_compute[187283]: 2025-12-03 15:01:53.727 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:55 np0005544118 nova_compute[187283]: 2025-12-03 15:01:55.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:55 np0005544118 nova_compute[187283]: 2025-12-03 15:01:55.719 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:55 np0005544118 nova_compute[187283]: 2025-12-03 15:01:55.719 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec  3 10:01:55 np0005544118 podman[226455]: 2025-12-03 15:01:55.862350057 +0000 UTC m=+0.072366140 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Dec  3 10:01:57 np0005544118 nova_compute[187283]: 2025-12-03 15:01:57.614 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:01:57 np0005544118 nova_compute[187283]: 2025-12-03 15:01:57.707 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:01:57 np0005544118 nova_compute[187283]: 2025-12-03 15:01:57.707 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec  3 10:01:57 np0005544118 podman[226476]: 2025-12-03 15:01:57.826338754 +0000 UTC m=+0.057941947 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible)
Dec  3 10:01:58 np0005544118 nova_compute[187283]: 2025-12-03 15:01:58.729 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:02:01.010 104491 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec  3 10:02:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:02:01.011 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec  3 10:02:01 np0005544118 ovn_metadata_agent[104486]: 2025-12-03 15:02:01.011 104491 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec  3 10:02:02 np0005544118 nova_compute[187283]: 2025-12-03 15:02:02.621 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:03 np0005544118 nova_compute[187283]: 2025-12-03 15:02:03.772 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:04 np0005544118 nova_compute[187283]: 2025-12-03 15:02:04.608 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:02:04 np0005544118 nova_compute[187283]: 2025-12-03 15:02:04.608 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec  3 10:02:04 np0005544118 nova_compute[187283]: 2025-12-03 15:02:04.623 187287 DEBUG nova.compute.manager [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec  3 10:02:05 np0005544118 podman[197639]: time="2025-12-03T15:02:05Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 10:02:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:02:05 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 10:02:05 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:02:05 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 10:02:05 np0005544118 podman[226496]: 2025-12-03 15:02:05.851397338 +0000 UTC m=+0.070776228 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, tcib_managed=true)
Dec  3 10:02:06 np0005544118 nova_compute[187283]: 2025-12-03 15:02:06.607 187287 DEBUG oslo_service.periodic_task [None req-c4fbcbfa-ba0f-498f-8d89-3383705f74ae - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec  3 10:02:07 np0005544118 nova_compute[187283]: 2025-12-03 15:02:07.624 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:08 np0005544118 nova_compute[187283]: 2025-12-03 15:02:08.775 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:10 np0005544118 podman[226517]: 2025-12-03 15:02:10.816406212 +0000 UTC m=+0.050358926 container health_status 910230440e3d0cdb7857ad13430ee4f6e6480731278e889257da86082dd1aa7b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec  3 10:02:12 np0005544118 nova_compute[187283]: 2025-12-03 15:02:12.625 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:13 np0005544118 nova_compute[187283]: 2025-12-03 15:02:13.776 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:17 np0005544118 nova_compute[187283]: 2025-12-03 15:02:17.628 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:17 np0005544118 podman[226541]: 2025-12-03 15:02:17.87498542 +0000 UTC m=+0.107698287 container health_status 6fdd6abc9af77258c8a1ca5fbb1680807b64bd745af5ed3474a889cb46efcbd1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd)
Dec  3 10:02:18 np0005544118 nova_compute[187283]: 2025-12-03 15:02:18.790 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:02:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:02:19 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec  3 10:02:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:02:19 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec  3 10:02:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:02:19 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec  3 10:02:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 10:02:19 np0005544118 openstack_network_exporter[199786]: ERROR   15:02:19 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec  3 10:02:19 np0005544118 openstack_network_exporter[199786]: 
Dec  3 10:02:22 np0005544118 nova_compute[187283]: 2025-12-03 15:02:22.628 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:23 np0005544118 nova_compute[187283]: 2025-12-03 15:02:23.849 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:26 np0005544118 podman[226567]: 2025-12-03 15:02:26.823262735 +0000 UTC m=+0.050551622 container health_status ec8ebdbc65eff4494539dd6807ec683077c516bfe15f699a18723bbb52ef2833 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_id=edpm, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec  3 10:02:27 np0005544118 systemd-logind[795]: New session 39 of user zuul.
Dec  3 10:02:27 np0005544118 systemd[1]: Started Session 39 of User zuul.
Dec  3 10:02:27 np0005544118 nova_compute[187283]: 2025-12-03 15:02:27.631 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:28 np0005544118 podman[226627]: 2025-12-03 15:02:28.350033549 +0000 UTC m=+0.065776885 container health_status 6dfd51a6c41f9acb8b85d7ccd9e97f28702454d91f02bfdd802f33323cfa9115 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec  3 10:02:28 np0005544118 nova_compute[187283]: 2025-12-03 15:02:28.851 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:31 np0005544118 ovs-vsctl[226788]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec  3 10:02:32 np0005544118 nova_compute[187283]: 2025-12-03 15:02:32.633 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:32 np0005544118 virtqemud[186958]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec  3 10:02:32 np0005544118 virtqemud[186958]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec  3 10:02:32 np0005544118 virtqemud[186958]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec  3 10:02:33 np0005544118 nova_compute[187283]: 2025-12-03 15:02:33.908 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec  3 10:02:35 np0005544118 podman[197639]: time="2025-12-03T15:02:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec  3 10:02:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:02:35 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 17091 "" "Go-http-client/1.1"
Dec  3 10:02:35 np0005544118 podman[197639]: @ - - [03/Dec/2025:15:02:35 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2603 "" "Go-http-client/1.1"
Dec  3 10:02:36 np0005544118 systemd[1]: Starting Hostname Service...
Dec  3 10:02:36 np0005544118 podman[227323]: 2025-12-03 15:02:36.090856977 +0000 UTC m=+0.068595290 container health_status f5ce0a6ebdfbcfd01d81962363d78814803b75cd97eb7cda6fb45251c0751c4c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=fa2bb8efef6782c26ea7f1675eeb36dd, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec  3 10:02:36 np0005544118 systemd[1]: Started Hostname Service.
Dec  3 10:02:37 np0005544118 nova_compute[187283]: 2025-12-03 15:02:37.635 187287 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
